PSS has been mandated to hire a Big Data Engineer for our client, a fast-growing financial services company offering lending, asset management and investment advisory services.

Summary of the role:

The candidate will have 5-8 years of experience spanning the core aspects of a data engineering role, including extracting, processing (batch and streaming), transforming and loading data for data warehousing and analytics, turning raw data into a format that can be analyzed and used for decision making. The analysis and insights derived from this data will be used to design, improve and deliver better products and services to our client's customers.

Key Responsibilities:

  • Develop data pipelines: Ingest, transform and load data from different sources, including databases, streams and applications (via APIs). This includes handling data in real time as well.
  • Design & implement an architecture for big data: Data storage across data lakes, distributed systems (Hadoop, Spark) or cloud platforms (AWS, Azure).
  • Develop data processing workflows: Using platforms like Apache Spark, Kafka or Flink.
  • Oversee data security and governance: Ensure that all data is managed, stored, analyzed and disclosed in line with global security and compliance standards such as GDPR and SOC 2.

Experience and Skills Required:

  • The candidate must have strong coding skills and experience programming in languages like Python, Java or Scala.
  • Knowledge of developing data pipelines using frameworks/platforms like Hadoop, Hive or Flink.
  • Experience and expertise in storing structured and unstructured data across storage systems such as HDFS, NoSQL databases (MongoDB, Cassandra), data warehouses and data lakes.
  • Knowledge of data security and governance standards like GDPR. 
  • Knowledge of data ingestion and streaming services like Kafka, Spark Streaming, Kinesis and Flume.
  • Knowledge of cloud platforms like AWS (S3, EMR), Google Cloud (BigQuery) and Microsoft Azure.
  • Experience and expertise in using DevOps tools like Docker, Kubernetes, Terraform and CI/CD pipelines.
  • Stakeholder management and collaboration: Ability to work through the technical challenges of big data acquisition and organization by collaborating with stakeholders including data scientists, data analysts and business managers.

Job Summary

Posted On:

05-Sep-2024

Function:

Technology - Infrastructure, Database & Cloud

Industry:

Banking, Microfinance & NBFC

Location:

New York

Employment Type:

Full Time