Data Flow Engineer, Warsaw (Near site) – EU Public Organisations
At WhiteTeam Consulting we are looking for a Senior Data Flow Engineer for Frontex in Warsaw, to design large-scale Apache NiFi/Cloudera data flows and Kafka integrations. Come and join us!
Profile: Data Flow Engineer.
Place of performance: Frontex Headquarters – 60% onsite / 40% remote. Candidates must be based within two hours of Warsaw.
Duration of the mission: 48 months.
Minimum level of education: Level 6.
Minimum English language skills (CEFR): B2.
Minimum IT relevant professional experience (years): 8.
Minimum experience at similar position (years): 6.
Rate: The rate offered depends on the candidate’s level.
Expected NWH (normal working hours): 230 days × 4 years.
Expected EWH (extra working hours): 23 days × 4 years.
Expected On‑Call: 3000 hours × 4 years.
Award Criteria:
50% Price.
50% Quality.
Minimum required scoring for interview: 60%.
Required certificates:
Security Clearance.
At least one of the following certifications:
- Cloudera Certified Developer for Apache NiFi or equivalent certification.
- Cloudera DataFlow (CFM) related certification or equivalent certification.
Equivalent certifications must be internationally recognized and subject to acceptance by the Contracting Authority.
Specific expertise:
Expert knowledge in defining, designing, implementing, and maintaining complex data flows in Apache NiFi (Cloudera DataFlow).
Advanced Python programming skills for data processing, NiFi custom logic, automation, and integrations.
Advanced experience in REST API–based integrations, including authentication (OAuth/JWT), rate limiting, and error handling.
Hands‑on experience in building CDC‑based data flows using native NiFi processors, connectors, and SQL Builder.
Good knowledge of Apache Iceberg (tables, schema evolution, partitioning).
Knowledge of data governance and cataloging in CDP, including:
- Apache Atlas (metadata, lineage, tagging).
- Apache Ranger (authorization, security policies).
Experience with Apache Kafka as messaging backbone (topics, producers/consumers, schema registry, NiFi integration).
Practical knowledge of Apache Avro as serialization standard, including schema evolution and compatibility.
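To illustrate the REST API integration expertise listed above (rate limiting and error handling), here is a minimal retry sketch with exponential backoff. The function name and the `(status, body)` return shape are illustrative assumptions, not any specific client library's API:

```python
import time

def call_with_backoff(request_fn, max_retries=4, base_delay=1.0):
    """Invoke an HTTP request callable, retrying on rate limits (429)
    and transient server errors (5xx) with exponential backoff.

    request_fn is any zero-argument callable returning (status, body);
    this shape is illustrative, not a specific HTTP client's API.
    """
    for attempt in range(max_retries + 1):
        status, body = request_fn()
        if status == 429 or 500 <= status < 600:
            if attempt == max_retries:
                raise RuntimeError(f"giving up after {attempt} retries (HTTP {status})")
            time.sleep(base_delay * (2 ** attempt))  # 1s, 2s, 4s, ...
            continue
        if status >= 400:
            # Client errors (4xx other than 429) are not retried.
            raise RuntimeError(f"client error HTTP {status}: {body}")
        return body
```

In production flows, a `Retry-After` response header, when present, would normally take precedence over the computed delay.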
Specific Requirements:
At least 2–3 years of hands‑on, daily experience with Apache NiFi, preferably in a Cloudera Data Platform (CDP) environment (design, deployment, monitoring, and troubleshooting of advanced flows).
Documented experience delivering at least one large‑scale integration project using NiFi as the central integration tool.
Practical experience with Apache Iceberg in CDP environments (table management, integration with NiFi / Spark / Flink).
Proven experience implementing CDC pipelines to and from relational databases.
Practical knowledge of configuring Apache Atlas and Ranger in the context of NiFi flows (tagging, policies, auditing).
Experience working with Kafka in CDP ecosystems, including schema management with Avro and downstream integrations.
Typical Tasks and Responsibilities
Data Flow Design & Implementation
Design, implement, test, and maintain complex data flows in Cloudera DataFlow (Apache NiFi).
Develop ingestion, transformation, enrichment, routing, and egress pipelines.
Build and optimize real‑time and near‑real‑time CDC pipelines using NiFi, Kafka, and Debezium / SQL CDC connectors.
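CDC events emitted by Debezium carry a standard envelope (`op`, `before`, `after`); a minimal sketch of how a downstream consumer might route such events follows. The returned action names are illustrative only:

```python
import json

def route_change_event(raw):
    """Route a Debezium-style change event to a sink action.

    The envelope fields (op, before, after) follow Debezium's public
    change-event format; the action names returned here are
    illustrative, not part of any Debezium API.
    """
    event = json.loads(raw)
    op = event["op"]  # c=create, u=update, r=snapshot read, d=delete
    if op in ("c", "u", "r"):
        # Inserts, updates, and snapshot reads all carry the new row
        # state in "after" and map naturally to an upsert.
        return ("upsert", event["after"])
    if op == "d":
        # Deletes carry the old row state in "before".
        return ("delete", event["before"])
    raise ValueError(f"unsupported op {op!r}")
```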
Integration & Streaming
Integrate external systems using REST APIs, JDBC, Kafka, and other protocols.
Manage and evolve data schemas using Apache Avro.
Ensure reliable delivery to downstream consumers and analytical platforms.
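Evolving Avro schemas safely hinges on resolution rules such as "fields added on the reader side must carry a default". The following is a simplified, illustrative check of that one rule for record schemas, not a replacement for a full Avro compatibility checker:

```python
def is_backward_compatible(writer_schema, reader_schema):
    """Check one core Avro resolution rule for record schemas: every
    field the reader adds (absent from the writer) must declare a
    default, so data written with the old schema can still be read.

    This is a simplified sketch; real Avro compatibility also covers
    type promotions, unions, aliases, and removed fields.
    """
    writer_fields = {f["name"] for f in writer_schema["fields"]}
    for field in reader_schema["fields"]:
        if field["name"] not in writer_fields and "default" not in field:
            return False
    return True
```

In a CDP setup, this kind of check is normally enforced centrally by the schema registry's compatibility mode rather than in application code.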
Governance, Security & Operations
Configure and manage metadata, lineage, and governance using Apache Atlas.
Define and maintain security and authorization policies using Apache Ranger.
Monitor, alert, and troubleshoot performance, reliability, and data quality of pipelines.
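Ranger authorization policies are managed as JSON documents via its REST API. As a sketch, a payload granting a group access to one Kafka topic might be assembled like this; the field names follow Ranger's public policy model, but the exact shape should be verified against the target Ranger version's documentation:

```python
def kafka_topic_policy(service, topic, group, accesses=("consume", "describe")):
    """Build a Ranger-style policy payload granting a group access to
    one Kafka topic.

    Field names (service, resources, policyItems, accesses) follow
    Ranger's public policy model; treat the exact shape as an
    assumption to confirm against your Ranger version's REST docs.
    """
    return {
        "service": service,
        "name": f"{topic}-{group}-access",
        # Resource matchers: which topic(s) the policy covers.
        "resources": {"topic": {"values": [topic], "isExcludes": False}},
        # Allow items: who gets which access types.
        "policyItems": [{
            "groups": [group],
            "accesses": [{"type": a, "isAllowed": True} for a in accesses],
        }],
    }
```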
Collaboration & Documentation
Collaborate with data engineers, architects, and business stakeholders on requirements and data flow architecture.
Create and maintain technical documentation, SOPs, and operational runbooks.
Participate in CDP, NiFi, and Kafka upgrades and migration activities.
Perform other duties as assigned by the team leader.
Travel:
Not foreseen.
Location:
Warsaw (Near site).
- Department: IT
- Role: Consultant
- Locations: Warsaw
- Remote status: Hybrid
What do we offer?
- Working hours: TheWhiteam offers flexible working hours, because we aim to meet objectives rather than clock a set number of hours.
- Technologies: The most cutting-edge technologies, to stay up to date with the changes of the moment.
- Working model: Given the current situation, TheWhiteam offers the choice of on-site, remote, or hybrid work.
- Locations: TheWhiteam offers the possibility of working in locations all over the world.
Workplace
Being part of THEWHITEAM means collaborating with a company made up of professionals with extensive experience in technology consulting.
We firmly believe that companies and clients set the course for the sector, but it is people who build it. We consider it vitally important that our organisation rests on our best asset and hallmark of added value: our people.
About The White Team
Founded in 2012 by experienced consultants, The Whiteam was created as a quality-focused technology consultancy with a clear mission: to help companies around the world optimise their business profitability through efficient use of information technology.