AWS Redshift Data Engineer

Pune, Maharashtra, India
Dec 27, 2024
Dec 20, 2025
Onsite
Full-Time
3 Years
Job Description

As an AWS Redshift Data Engineer, you will be responsible for designing, building, and managing data pipelines in the AWS ecosystem, with a focus on AWS Redshift, AWS RDS, and AWS S3. You will work closely with the client’s team to design and implement data solutions that are scalable, reliable, and aligned with the business goals. Your role will also involve troubleshooting, testing, and optimizing data systems, as well as contributing to CI/CD processes to streamline development workflows.

Key Responsibilities

  1. Data Pipeline Development. Design and implement ETL/ELT pipelines using AWS Redshift, AWS S3, AWS Lambda, and other AWS services to automate data flows and integrate data from diverse sources.
  2. Cloud Services Management. Work with various AWS cloud services like AWS EC2, AWS RDS, AWS API Gateway, and AWS CodePipeline to build and maintain cloud-based data architectures.
  3. Data Modeling. Collaborate with the client team to design and implement optimized data models and schemas in AWS Redshift, ensuring alignment with business needs and performance requirements.
  4. SQL & Python Programming. Develop complex SQL queries and use Python for data transformations, data validation, and automation of data tasks within the AWS cloud environment.
  5. CI/CD Implementation. Contribute to CI/CD pipeline development, using GitHub for version control and JIRA for project management to ensure smooth code deployments and updates.
  6. Data Integration. Handle API integrations to connect external systems with AWS-based data services, ensuring seamless data flow across platforms.
  7. Testing & Debugging. Write unit tests and perform debugging to ensure the accuracy and efficiency of pipelines, resolving any issues found during the testing phase and supporting production deployment.
  8. Documentation & Reporting. Update technical documentation and provide clear status reports on your tasks, ensuring alignment with project timelines and client requirements.
  9. Agile Methodology. Work in an agile environment, adapting to evolving requirements and collaborating effectively with cross-functional teams to meet delivery timelines.

Required Skills & Qualifications

  1. Experience. 3-5 years of hands-on experience as a Data Engineer working with AWS cloud services, specifically AWS Redshift, AWS S3, AWS Lambda, and AWS RDS.
  2. Programming Skills. Strong proficiency in SQL (for querying databases and building data models) and Python (for data transformations and automation).
  3. Cloud Services. In-depth experience with AWS Redshift, AWS Lambda, AWS S3, AWS EC2, AWS CodePipeline, AWS RDS, and AWS API Gateway.
  4. CI/CD Experience. Familiarity with CI/CD pipelines, version control platforms like GitHub, and project management tools like JIRA.
  5. Data Engineering Knowledge. Experience in building ETL/ELT pipelines and working with data warehouses in AWS, handling data integration from various sources.
  6. Data Modeling & Architecture. Experience in data modeling and building robust data architectures to support high-performance data analytics and reporting.
  7. Problem Solving. Excellent analytical and problem-solving skills, with the ability to troubleshoot and optimize complex data systems.
  8. Communication Skills. Proficiency in English, both written and verbal, for effective communication with teams and stakeholders. Good documentation practices for maintaining clear and consistent project updates.
  9. Agile Mindset. Comfort with agile workflows, including the ability to adapt to evolving requirements and deliver solutions iteratively.

Preferred Qualifications

  1. Big Data Technologies. Familiarity with Big Data technologies (e.g., Hadoop, Spark) is a plus.
  2. Data Governance. Understanding of data governance practices, especially related to compliance, privacy, and security in the cloud.
  3. Experience with AWS Glue. Exposure to AWS Glue for ETL management and data transformation workflows.

What We Offer

  1. Competitive Salary. Attractive salary package based on skills and experience.
  2. Growth & Development. Opportunities for career growth in cloud data engineering and access to the latest cloud technologies and AWS certifications.
  3. Dynamic Environment. Work with a global team in an innovative environment where your work makes a real impact.
  4. Learning Opportunities. Continuous learning with exposure to new AWS services, tools, and evolving cloud technologies.
  5. Work-Life Balance. Flexible working hours and the ability to work in a collaborative, supportive atmosphere.

How to Apply

If you are passionate about AWS cloud services and data engineering and want to work on innovative, cutting-edge cloud solutions, we would love to hear from you! Click Easy Apply or share your resume with us for consideration.