Posted 13d ago (Dec 9, 24)
Data Operations Engineer
At Norstella, our mission is simple: to help our clients bring life-saving therapies to market quicker, and help patients in need.
Founded in 2022, but with history going back to 1939, Norstella unites best-in-class brands to help clients navigate the complexities at each step of the drug development life cycle, and get the right treatments to the right patients at the right time.
Each organization (Citeline, Evaluate, MMIT, Panalgo, The Dedham Group) delivers must-have answers for critical strategic and commercial decision-making. Together, via our market-leading brands, we help our clients:
- Citeline – accelerate the drug development cycle
- Evaluate – bring the right drugs to market
- MMIT – identify barriers to patient access
- Panalgo – turn data into insight faster
- The Dedham Group – think strategically for specialty therapeutics
The Role:
We are seeking a skilled Data Operations Engineer to join our team. This role involves managing data operations, including configuring, executing, and monitoring ETL pipelines. The ideal candidate will possess a strong foundation in ETL tools, scripting languages/platforms, and the AWS CLI, and a keen ability to troubleshoot, optimize, and maintain data infrastructure at scale. We are hiring for two roles – a Data Operator and a Lead Data Operations Engineer – each with distinct levels of expertise.
Responsibilities:
- Monitor and maintain Spark (Java) and Ruby-based ETL processes.
- Support the configuration and execution of ETL pipelines written in Ruby and Spark (Java).
- Conduct basic automation tasks using Bash and Ruby scripting.
- Perform AWS operations through the command line interface (CLI) for infrastructure management.
- Conduct SQL data analytics tasks, using tools like Athena, to support data-driven decision-making.
- Support basic data engineering tasks and data operations at scale.
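As a rough illustration of the "basic automation tasks using Bash and Ruby scripting" mentioned above, here is a minimal Ruby sketch that scans ETL log output and counts failed jobs. The log line format (`job=<name> status=<status>`) and job names are hypothetical, not taken from Norstella's actual pipelines:

```ruby
# Minimal sketch of a basic ETL-monitoring automation task in Ruby.
# Assumes a hypothetical log format like:
#   "2024-12-09T02:00:00Z ERROR job=daily_load status=FAILED"
# Real pipeline logs will differ.

def summarize_failures(log_lines)
  failures = Hash.new(0)
  log_lines.each do |line|
    # Pull the job name and status out of each line; skip lines
    # that don't match the expected key=value pattern.
    match = line.match(/job=(\S+)\s+status=(\S+)/)
    next unless match
    job, status = match[1], match[2]
    failures[job] += 1 if status == "FAILED"
  end
  failures
end
```

A script like this could be wired into a cron job or a Bash wrapper that alerts when the failure count is nonzero; in practice this kind of check is often delegated to the scheduler's own monitoring.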
Requirements:
- 0-2 years of experience in data operations or related fields.
- Basic knowledge and proficiency in:
- Monitoring and running Spark jobs.
- Bash scripting for automation tasks.
- Ruby and Java programming.
- AWS EMR and CLI.
- SQL for data analytics.
- Ability to perform basic tasks in a structured data environment.
- Strong analytical and problem-solving skills.
- Ability to work collaboratively in a fast-paced environment.
- Excellent communication and organizational skills.
- Certification in AWS, Snowflake, or other relevant technologies is a plus.