
PDE-I Data Engineer

Phenom

Data Science
Hyderabad, Telangana, India
Posted on Wednesday, May 17, 2023
Job Description

Phenom People is looking for bright and motivated Data Engineers to play a key role in building near-real-time analytics and the next-generation Enterprise Data Lake. The ideal candidate will be passionate about building an extremely large, scalable data lake on the cloud and will want to join a team committed to delivering data-driven innovation at Phenom. We are one of the fastest-growing global Talent Relationship Marketing cloud platforms, with a strong technology orientation.


Responsibilities

  • Translate business and functional requirements into robust, scalable solutions that fit well within the overall data architecture.
  • Develop and maintain scalable data pipelines and build new API integrations.
  • Design, develop, implement, test, document, and operate large-scale, high-volume, low-latency applications.
  • Design data integrations and a data quality framework (see the sketch after this list).
  • Participate in the full development life cycle, end to end: design, implementation, testing, documentation, delivery, support, and maintenance.
  • Deliver end-to-end projects independently.
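To make the data-quality responsibility concrete, here is a minimal batch-pipeline sketch with a simple quality gate in PySpark. The bucket paths, column names, and the 5% rejection threshold are illustrative assumptions, not Phenom specifics:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("etl-quality-check").getOrCreate()

# Load a raw extract; the path and schema are illustrative assumptions
# (reading from S3 also requires the hadoop-aws connector on the classpath).
raw = spark.read.json("s3://example-bucket/raw/candidates/")

# Basic quality gate: reject rows missing a primary key or an email.
valid = raw.filter(F.col("candidate_id").isNotNull() & F.col("email").isNotNull())
rejected = raw.subtract(valid)

# Surface the rejection rate so the pipeline can fail loudly on regressions.
total, bad = raw.count(), rejected.count()
if total and bad / total > 0.05:  # 5% threshold is an arbitrary example
    raise ValueError(f"Data quality gate failed: {bad}/{total} rows rejected")

valid.write.mode("overwrite").parquet("s3://example-bucket/clean/candidates/")
```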

Work Experience

Must have

  • Excellent knowledge of Python or Java programming.
  • 1+ years of development experience in big data using Spark or Flink.
  • Experience with dimensional modeling, data warehousing, and building ETL pipelines.
  • Strong expertise in SQL and experience writing complex queries.
  • Knowledge of building stream-processing platforms using Kafka and Flink Streaming (see the sketch after this list).
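As a flavor of the streaming work these requirements describe, here is a minimal PySpark Structured Streaming sketch that reads events from Kafka and computes windowed counts. The broker address, topic name, and event schema are illustrative assumptions:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("candidate-events-stream")  # hypothetical app name
    .getOrCreate()
)

# Read a stream of JSON events from Kafka; requires the spark-sql-kafka
# connector package on the classpath. Topic and fields are illustrative.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "candidate-events")
    .load()
    .select(
        F.get_json_object(F.col("value").cast("string"), "$.event_type").alias("event_type"),
        F.col("timestamp"),
    )
)

# Count events per type over 5-minute tumbling windows, tolerating
# 10 minutes of late data via a watermark.
counts = (
    events
    .withWatermark("timestamp", "10 minutes")
    .groupBy(F.window("timestamp", "5 minutes"), "event_type")
    .count()
)

# Write the rolling aggregates; a console sink keeps the sketch self-contained.
query = (
    counts.writeStream
    .outputMode("update")
    .format("console")
    .option("truncate", "false")
    .start()
)
query.awaitTermination()
```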

Good to have

  • Knowledge of job orchestration frameworks like Airflow, Oozie, or Luigi (see the sketch after this list).
  • Experience with AWS services such as S3, EMR, and RDS.
  • Experience with or understanding of Apache Pinot or Apache Doris.
  • Good understanding of cloud data warehouses like Snowflake.
  • Good understanding of distributed SQL query engines like Trino and Druid.
  • Knowledge of stream-processing frameworks like Flink.
  • Knowledge of NoSQL databases like HBase and Cassandra.
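For orchestration, the sketch below shows a minimal Airflow 2.x DAG wiring an extract step to a load step. The DAG name, schedule, and task bodies are hypothetical placeholders:

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step; in practice this might pull from S3 or RDS.
    print("extracting source data")


def load(**context):
    # Placeholder load step; in practice this might write to Snowflake or a data lake.
    print("loading into warehouse")


with DAG(
    dag_id="daily_candidate_etl",  # hypothetical DAG name
    start_date=datetime(2023, 5, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # run load only after extract succeeds
```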