Data Engineer

 

Description:


This position will build and maintain highly scalable, reliable, and efficient data pipelines that provide data insights interfaces for real-time decision-making to both the inbound and outbound messaging stacks, as well as to other internal engineering solutions.

Responsibilities

In this role, you’ll:
 

  • Oversee the design, construction, testing, and maintenance of advanced, scalable data architectures and pipelines.
  • Drive the development of innovative data solutions that meet complex business requirements.
  • Create and enforce best practices for data architecture, ensuring scalability, reliability, and performance.
  • Provide architectural guidance and mentorship to junior engineers.
  • Tackle the most challenging technical issues and provide advanced troubleshooting support.
  • Collaborate with senior leadership to align data engineering strategies with organizational goals.
  • Participate in long-term planning for data infrastructure and analytics initiatives.
  • Lead cross-functional projects, ensuring timely delivery and alignment with business objectives.
  • Coordinate with product managers, analysts, and other stakeholders to define project requirements and scope.
  • Continuously monitor and enhance the performance of data systems and pipelines.
     

Qualifications

Not all applicants will have skills that match a job description exactly. Twilio values diverse experiences in other industries, and we encourage everyone who meets the required qualifications to apply. While having “desired” qualifications makes for a strong candidate, we encourage applicants with alternative experiences to also apply. If your career is just starting or hasn't followed a traditional path, don't let that stop you from considering Twilio. We are always looking for people who will bring something new to the table!

Required
 

  • 3+ years of Java development experience.
  • Experience with lakehouse technologies such as Apache Hudi, Apache Iceberg, or Databricks Delta Lake.
  • Experience in building AI/ML pipelines.
  • Deep technical understanding of ETL tools, low-latency data stores, and multiple data warehouses and data catalogs.
  • Familiarity with data testing and verification tooling and best practices.
  • Experience with cloud services (AWS preferred; Google Cloud, Azure, etc.).
  • Proficiency with key-value, streaming, and search database technologies, including AWS DynamoDB, Apache Kafka, and Elasticsearch.
  • Readiness to participate in the on-call rotation.

Organization Twilio
Industry Engineering
Occupational Category Data Engineer
Job Location Dublin, Ireland
Shift Type Morning
Job Type Full Time
Gender No Preference
Career Level Experienced Professional
Experience 3 Years
Posted at 2024-08-18 12:26 pm
Expires on 2024-12-01