Senior Data Engineer
Published on: December 20, 2024


Project information:

  • Location of office: Warsaw
  • Mode of work: 100% remote
  • Type of employment: B2B contract
  • Rate: up to 180 PLN NET/H
  • Project language: English (min B2)


Responsibilities:

  • Designing new solutions and proposing improvements to existing data platforms, driven both by business requests (functional changes) and by technology needs (architectural changes)
  • Developing data platforms and ETL/ELT processes: providing technical support and actively participating in platform development
  • Building and optimizing ETL/ELT processes responsible for processing large data sets
  • Implementing processes that ensure optimal data processing
  • Standardizing and streamlining technical processes: implementing and optimizing standards for code, test, and documentation management
  • Selecting and configuring tools and development environments that support data engineering processes, maintain code quality, and facilitate scaling
  • Ensuring standards compliance and code review: applying existing platform development standards and monitoring the quality of delivered solutions
  • Working hands-on as a Data Engineer and Data Analyst to maintain a high level of technical proficiency, understand current challenges, and drive improvements based on actual technical needs
  • Mentoring the team by providing subject-matter support in solution design, code standardization, process optimization, and best-practice implementation

Requirements:

  • Minimum 5 years of experience in designing and building Business Intelligence, ETL/ELT, Data Warehouse, Data Lake, Data Lakehouse, Big Data, OLAP class solutions
  • Practical knowledge of various relational (e.g., SQL Server, Oracle, Redshift, PostgreSQL, Teradata) and non-relational database engines (e.g., MongoDB, Cosmos DB, DynamoDB, Neo4j, HBase, Redis, InfluxDB)
  • Strong proficiency in SQL and Python (minimum 5 years of experience)
  • Familiarity with data engineering and orchestration tools, particularly Spark/Databricks (including structured streaming mechanisms, DLT, etc.), Hadoop/CDP, Azure/Fabric Data Factory, Apache Flink, Apache Kafka, Apache Airflow, dbt, Debezium, and more
  • Understanding of data governance, data quality, and batch/streaming data processing challenges
  • Knowledge of architectural patterns in data, including Data Mesh, Data Vault, Dimensional Modeling, Medallion Architecture, and Lambda/Kappa Architectures
  • Proficiency in using Git repositories (Bitbucket, GitHub, GitLab)
  • Experience with data services on the Azure and/or AWS platforms
  • Flexibility, self-reliance, and efficiency, with a strong sense of responsibility for assigned tasks
  • Practical knowledge of English at a minimum B2 level (C1+ preferred)

We offer:

  • B2B contract with a rate of up to 180 PLN NET/H
  • 100% remote job
  • Opportunity to be involved in global projects
  • Additional benefits
  • Training and certification budget (Microsoft, AWS, and Databricks certifications)
  • Learning time - 60 paid hours per year


Any questions? Contact:

Maria Kozlowska

Recruitment Specialist

