Job Description
Job Title: Data Engineer
Company: Tanla
Years of Experience: 5–10
Location: Hyderabad
Role Type: Full-Time
Salary: Competitive, commensurate with experience and industry standards
Eligibility: Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field
Role Overview
We are seeking a skilled and experienced Data Engineer to join our dynamic Data Engineering team. The ideal candidate will design, build, and optimize large-scale data pipelines and data warehouses that power real-time analytics and business intelligence across Tanla’s CPaaS ecosystem. You will work with diverse, high-volume datasets such as Call Detail Records (CDRs) and collaborate closely with cross-functional teams to ensure data reliability, scalability, and performance.
Key Responsibilities
- Design, develop, and implement scalable ETL/ELT pipelines for large-scale data processing and analytics
- Build and optimize advanced analytical SQL queries with a focus on performance and scalability
- Work with Big Data frameworks including Hadoop, Spark, and Kafka for distributed data processing
- Design, build, and maintain data warehouses and data models to support reporting and analytics
- Efficiently parse, transform, and manage Call Detail Records (CDRs) and other telecom datasets
- Collaborate with data scientists, analysts, and business stakeholders to enable seamless data flow
- Ensure data quality, integrity, governance, and security across distributed data systems
- Monitor and optimize data pipelines for reliability and operational excellence
Skills and Qualifications
- 5–10 years of hands-on experience in Data Engineering or related roles
- Strong expertise in Advanced SQL, including analytical queries and performance tuning
- Proven experience in ETL/ELT architecture and data pipeline development
- Hands-on experience with Big Data technologies such as Hadoop, Spark, and Kafka
- Strong knowledge of data modeling and warehouse design (Star and Snowflake schemas)
- Extensive experience with CDR parsing and telecom data transformation
- Proficiency in Python, Scala, or Java for building data workflows
- Experience with orchestration tools such as Airflow and cloud data warehouses such as Snowflake or Redshift
- Familiarity with cloud platforms including AWS, GCP, or Azure
- Exposure to telecom data systems or network analytics is highly preferred
- Strong problem-solving skills and ability to work collaboratively in cross-functional teams