Data Engineer
Location: Warszawa (mazowieckie)
Reference number: 1195523/jobs.pl
Employment type: Full-time

Description:

Responsibilities:
- Create and manage ETL workflows that extract, transform, and load data from multiple sources into centralized repositories such as data lakes or warehouses.
- Connect and harmonize data from varied origins (databases, APIs, streaming services, and file systems) into unified, reliable pipelines.
- Implement solutions that accommodate real-time, batch, and incremental data processing needs.
- Establish validation and quality assurance mechanisms to maintain data accuracy, completeness, and consistency.
- Build and sustain robust pipelines that deliver data efficiently to storage platforms and analytical tools.
- Develop proactive monitoring systems to detect and resolve data issues before they impact downstream processes.
- Refine ETL operations for speed, reliability, and data integrity.
- Deploy and oversee relational, NoSQL, and distributed storage systems tailored to business needs.
- Administer the computing and storage resources necessary for smooth data engineering operations.
- Implement encryption, access controls, and audit trails to ensure data privacy and meet regulatory standards like GDPR, HIPAA, and PCI-DSS.
- Deliver production-grade, testable code that adheres to best practices and handles edge cases gracefully.
- Create thorough unit and integration tests to confirm pipeline functionality and data integrity.
- Continuously explore emerging tools and technologies to enhance the team’s capabilities.
- Work with analysts and stakeholders to translate business needs into scalable engineering solutions.
- Coordinate with other engineering teams to embed data solutions into the wider tech ecosystem.
- Engage in agile ceremonies, share progress updates, and manage interdependencies effectively.
Requirements:
- Bachelor’s or advanced degree in Computer Science, Data Science, or a related field.
- 1-3 years of experience in data engineering, with a focus on building scalable and reliable systems.
- Skilled in Python, Java, or Scala.
- Experienced with big data tools like Hadoop, Spark, and Kafka.
- Familiar with cloud platforms such as AWS and Azure.
- Knowledgeable in ETL and integration tools like Talend, Informatica, or Apache NiFi.
- Strong grasp of data warehousing and lakehouse principles.
- Understanding of CI/CD pipelines and monitoring strategies.
- Data engineering certifications (e.g., AWS or Azure) are a plus.
- Proven ability to work across teams and explain technical concepts clearly.
- Willingness to travel to client sites when needed.
- Excellent interpersonal and communication skills.
hays.pl
Hays Poland sp. z o.o. is an employment agency (registry no. 361).
Posted: 25 Sep 2025
Source: www.jobs.pl