Senior Data Engineer

Bengaluru, Karnataka, India
Oct 07, 2024
Oct 07, 2025
Onsite
Full-Time
6 Years
Job Description

We are seeking a Senior Data Engineer to build and maintain the technical infrastructure needed to process large volumes of data efficiently. You will collaborate closely with Data Scientists, providing structured, well-organized data that enables the insights, predictions, and analytics behind innovative tools and solutions.

Your key responsibilities include developing data models, building data pipelines, and managing the retrieval, storage, and distribution of data on modern, cloud-based data platforms. The role offers the opportunity to work with a range of technologies, including cloud data warehouses, containerized applications, and data orchestration tools.

Experience: 6 - 9 Years

Key Responsibilities

  • Design, develop, and maintain data models and pipelines to process and deliver data in optimal formats.
  • Collaborate with Data Scientists to provide structured data for analysis and tool development.
  • Manage data retrieval, storage, and distribution efficiently across the organization.
  • Develop and implement Enterprise Data Platforms to meet business needs.
  • Build and maintain orchestration/data pipelines using tools such as Airflow, Argo Workflows, dbt, AWS Lambda/Step Functions, or AWS Glue.
  • Work extensively with relational databases such as Postgres, MySQL, Oracle, and SQL Server, and with cloud data warehouses such as Snowflake, Redshift, Synapse Analytics, or BigQuery.
  • Utilize technologies such as Pig, Hive, and Spark, and demonstrate proficiency in SQL.
  • Leverage knowledge of Java or Python for backend development.
  • Develop containerized applications and microservices with a solid understanding of Kubernetes.
  • Work with cloud platforms such as AWS, Azure, or GCP to ensure robust data management.
  • Integrate CI/CD pipelines using tools like ArgoCD.

Preferred Skills & Qualifications

  • Data Warehousing, Data Modelling, and Data Analysis expertise.
  • Hands-on experience with building Enterprise Data Platforms.
  • Familiarity with data orchestration tools such as Airflow or dbt.
  • Experience with Cloud technologies (AWS, Azure, GCP).
  • Knowledge of Containerization and Microservices deployment using Kubernetes.
  • Python programming experience is highly desirable.
  • Knowledge of Continuous Deployment tools such as ArgoCD is a plus.

What We Offer

  1. Exciting Projects. Be a part of projects in industries like high-tech, communications, media, healthcare, retail, and telecom. Collaborate with global brands and leaders on impactful products and solutions.
  2. Collaborative Environment. Grow your skills by working with diverse, highly talented teams in a dynamic and open environment.
  3. Work-Life Balance. Flexible work schedules, options to work from home, paid time off, and holidays to ensure a healthy work-life balance.
  4. Professional Development. Regular training programs for soft and technical skills, certifications, and stress management sessions.
  5. Competitive Benefits. Attractive compensation, family medical insurance, life insurance, NPS (National Pension Scheme), extended maternity leave, annual performance bonuses, and referral bonuses.
  6. Fun Perks. Participate in sports events, cultural activities, and corporate parties. Enjoy food subsidies, GL Zones, and discounts at popular stores and restaurants.

About GlobalLogic

GlobalLogic is a Hitachi Group Company and a leader in digital engineering. We help companies across industries design and build innovative digital products, platforms, and experiences for the modern world. With a presence across the globe, we combine experience design, complex engineering, and data expertise to help clients accelerate their journey into the future of business.
