
Posted 13d ago (Dec 9, 24)

Norstella · Yardley, PA

Data Operations Engineer

India · Full-time · Java Spark · Ruby · Bash · AWS CLI · SQL · Athena

At Norstella, our mission is simple: to help our clients bring life-saving therapies to market quicker, and help patients in need.

Founded in 2022, but with history going back to 1939, Norstella unites best-in-class brands to help clients navigate the complexities at each step of the drug development life cycle, and get the right treatments to the right patients at the right time.

Each organization (Citeline, Evaluate, MMIT, Panalgo, The Dedham Group) delivers must-have answers for critical strategic and commercial decision-making. Together, via our market-leading brands, we help our clients:

  • Citeline – accelerate the drug development cycle
  • Evaluate – bring the right drugs to market
  • MMIT – identify barriers to patient access
  • Panalgo – turn data into insight faster
  • The Dedham Group – think strategically for specialty therapeutics

The Role:

We are seeking a skilled Data Operations Engineer to join our team. This role involves managing data operations, including configuring, executing, and monitoring ETL pipelines. The ideal candidate will have a strong foundation in ETL tools, scripting languages and platforms, and the AWS CLI, along with a keen ability to troubleshoot, optimize, and maintain data infrastructure at scale. We are hiring for two roles – a Data Operator and a Lead Data Operations Engineer – each with a distinct level of expertise.

Responsibilities:

  • Monitor and maintain Java Spark- and Ruby-based ETL processes.
  • Support the configuration and execution of ETL pipelines written in Ruby and Spark (Java).
  • Conduct basic automation tasks using Bash and Ruby scripting.
  • Perform AWS operations through the command line interface (CLI) for infrastructure management.
  • Conduct SQL data analytics tasks, using tools like Athena, to support data-driven decision-making.
  • Support basic data engineering tasks and data operations at scale.
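
The Bash automation called out above might, in practice, look like the sketch below: a small helper that scans an ETL run log for errors and reports a pass/fail status. The log format, file names, and error pattern are illustrative assumptions, not details of Norstella's actual pipelines.

```shell
#!/usr/bin/env bash
# Hypothetical ETL log checker -- log layout and the 'ERROR' marker are
# illustrative assumptions, not Norstella's actual pipeline conventions.
set -euo pipefail

check_etl_log() {
  local logfile="$1"
  local errors
  # Count lines containing ERROR; grep exits nonzero on no match, so
  # guard it to keep 'set -e' from aborting the script.
  errors=$(grep -c 'ERROR' "$logfile" || true)
  if [ "$errors" -gt 0 ]; then
    echo "FAIL: $errors error(s) found in $logfile"
    return 1
  fi
  echo "OK: no errors in $logfile"
}
```

A helper like this would typically be wired into a cron job or a pipeline step so that failed runs surface immediately rather than being discovered downstream.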

Requirements:

  • 0-2 years of experience in data operations or related fields.
  • Basic knowledge and proficiency in:
    • Monitoring and running Spark jobs.
    • Bash scripting for automation tasks.
    • Ruby and Java programming.
    • AWS EMR and CLI.
    • SQL for data analytics.
  • Ability to perform basic tasks in a structured data environment.
  • Strong analytical and problem-solving skills.
  • Ability to work collaboratively in a fast-paced environment.
  • Excellent communication and organizational skills.
  • Certification in AWS, Snowflake, or other relevant technologies is a plus.

Benefits

Health Insurance · Provident Fund · Reimbursement of Certification Expenses · Gratuity · 24x7 Health Desk