As a Senior Data Ops Engineer at Luxoft, you will play a critical role in developing a scalable data collection, storage, and distribution platform. The platform will handle data from a variety of sources, including vendors, research providers, exchanges, prime brokers (PBs), and web scraping. You will be responsible for making this data available to teams across the firm, including systematic and fundamental portfolio managers (PMs) as well as enterprise functions such as Operations, Risk, Trading, and Compliance. You will also contribute to building internal data products and analytics that drive decision-making.
Responsibilities
- Develop and manage data collection processes using scripts, APIs, and web scraping tools to ingest data from vendors, exchanges, PBs, and public websites (see the ingestion sketch after this list).
- Contribute to the development and maintenance of a greenfield data platform running on Snowflake and AWS.
- Ensure the platform is scalable and can handle growing data volumes and complex data pipelines.
- Analyze and understand existing data pipelines and enhance them to accommodate new data requirements.
- Continuously improve the platform's efficiency and functionality.
- Onboard new data providers to the platform, ensuring seamless integration of new data sources.
- Lead or assist with data migration projects, ensuring data is transferred securely and efficiently between systems.
- Collaborate with various teams, including data engineers, data scientists, and business stakeholders.
- Use strong communication skills to ensure clear understanding and alignment on requirements and deliverables.
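To make the first responsibility concrete, here is a minimal sketch of a typical ingestion task in this kind of role: pulling data from a vendor REST API and landing the raw payload in S3. The endpoint, bucket, and key layout are illustrative assumptions, not details of the actual platform.

```python
import json
from datetime import datetime, timezone

import boto3
import requests

# Hypothetical endpoint and bucket -- stand-ins for a real vendor feed
# and the platform's raw landing zone, neither of which is named here.
VENDOR_URL = "https://api.example-vendor.com/v1/prices"
RAW_BUCKET = "dataops-raw-landing"

def ingest_vendor_prices(symbol: str) -> str:
    """Pull one symbol's prices from a vendor API and land the raw
    payload in S3, keyed by symbol and ingestion timestamp."""
    resp = requests.get(VENDOR_URL, params={"symbol": symbol}, timeout=30)
    resp.raise_for_status()  # fail fast on vendor-side errors

    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%SZ")
    key = f"vendor_prices/symbol={symbol}/{ts}.json"

    # Land the untouched payload first; transformation happens downstream.
    boto3.client("s3").put_object(
        Bucket=RAW_BUCKET,
        Key=key,
        Body=json.dumps(resp.json()).encode("utf-8"),
    )
    return key

if __name__ == "__main__":
    print(ingest_vendor_prices("AAPL"))
```

Landing the raw payload untouched before any transformation is a common pattern on S3-backed platforms: it keeps reprocessing possible when downstream logic changes.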
Mandatory Skills
- SQL. Strong proficiency in SQL for querying, analyzing, and manipulating data in relational databases (a combined SQL-and-Python sketch follows this list).
- Python. Proficient in Python for developing data ingestion scripts, automation tasks, and data pipeline enhancements.
- Linux. Good knowledge of Linux systems for managing environments and handling infrastructure.
- Containerization. Experience with containerization technologies like Docker and Kubernetes for application deployment and management.
- AWS. Strong understanding and experience with AWS (Amazon Web Services) for cloud-based platform development and management.
- DevOps. Solid experience with DevOps practices and tooling such as Jenkins, alongside the container stack above, to ensure smooth deployment, monitoring, and automation of data pipelines.
- Communication. Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders and collaborate with cross-functional teams.
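As one illustration of how the SQL and Python requirements meet on a Snowflake-backed platform like this one, the sketch below runs a window-function deduplication through Snowflake's official Python connector. The table, column names, and credentials are placeholders, not details of the actual environment.

```python
import snowflake.connector  # pip install snowflake-connector-python

# Deduplicate a raw vendor feed: keep only the latest record per
# (symbol, trade_date). Table and column names are illustrative.
DEDUP_SQL = """
CREATE OR REPLACE TABLE prices_clean AS
SELECT symbol, trade_date, close_price
FROM raw_prices
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY symbol, trade_date
    ORDER BY ingested_at DESC
) = 1
"""

def run_dedup() -> None:
    # Placeholder credentials -- in practice these would come from a
    # secrets manager, never from source code.
    conn = snowflake.connector.connect(
        account="your_account",
        user="your_user",
        password="your_password",
        warehouse="ETL_WH",
        database="MARKET_DATA",
        schema="PUBLIC",
    )
    try:
        conn.cursor().execute(DEDUP_SQL)
    finally:
        conn.close()

if __name__ == "__main__":
    run_dedup()
```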
Nice-to-Have Skills
- Market Data/Capital Markets Experience. Familiarity with market data projects or experience working in the capital markets domain will be a significant advantage.
- Airflow. Experience with Apache Airflow for managing and scheduling complex data workflows and pipelines.
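For context, a minimal Airflow DAG is sketched below: a daily two-task pipeline with retries, roughly the shape the workflows described above would take. The DAG id, schedule, and task bodies are illustrative assumptions.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    ...  # e.g., call an ingestion function like the vendor sketch above

def load() -> None:
    ...  # e.g., copy the landed files into Snowflake

with DAG(
    dag_id="vendor_prices_daily",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # Airflow 2.4+ keyword
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```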
Languages
- English. Upper-intermediate (B2) proficiency in English, both spoken and written.
Preferred Candidate Profile
- Experience. 5+ years of experience in Data Engineering, DevOps, or Data Ops roles.
- Technical Expertise. Proficient in modern data engineering practices, cloud technologies, and containerized environments.
- Problem Solving. Strong analytical and troubleshooting skills to handle data-related issues efficiently.
- Team Player. Ability to collaborate effectively with different teams to deliver data-driven solutions.
Additional Information
- This is a remote position; you can work from anywhere in India.
- Full-time role with opportunities for professional growth and exposure to cutting-edge technologies in the data domain.
How to Apply
If you have a strong background in Data Engineering and DevOps, along with hands-on experience in AWS, Python, SQL, and containerization technologies, we invite you to apply for this exciting opportunity at Luxoft.