Description:
Our client, a leader in the financial services industry, is seeking an experienced PySpark Data Engineer with strong expertise in the Asset Management sector. The role is based in Dublin and is a permanent position within a dynamic team.
Job Responsibilities:
- Design, implement, and optimize PySpark data pipelines for large-scale data processing.
- Collaborate with data scientists and analysts to support data-driven decision-making processes.
- Ensure the integrity and accuracy of data throughout its lifecycle.
- Develop and maintain data models and schemas.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Work closely with stakeholders to understand data requirements and translate them into technical solutions.
- Maintain up-to-date knowledge of industry trends and advancements in big data technologies.
Experience Required:
- Minimum 5 years of experience as a Data Engineer with a focus on PySpark.
- Proven experience in the Asset Management industry.
- Strong understanding of data warehousing concepts and data modeling techniques.
- Proficiency in SQL and experience with relational databases.
- Familiarity with cloud platforms (AWS, Azure, or Google Cloud).
- Excellent problem-solving and analytical skills.
- Strong communication skills and ability to work collaboratively with cross-functional teams.