Data Engineer
7 days ago
We're looking for a Data Engineer ready to push boundaries and grow with us. Datumo specializes in providing Data Engineering and Cloud Computing consulting services to clients from all over the world, primarily in Western Europe, Poland and the USA. Core industries we support include e-commerce, telecommunications and life sciences. Our team consists of exceptional people whose commitment allows us to deliver highly demanding projects. Our team members tend to stay with us for more than 3 years, and when a project wraps up, we don't let them go - we set out to find exciting new challenges for them. It's not just a workplace; it's a community that grows together.

Must-have:
- at least 3 years of commercial experience in programming
- a proven record with a selected cloud provider: GCP (preferred), Azure or AWS
- good knowledge of a JVM language (Scala, Java or Kotlin), Python and SQL
- experience with a data warehousing solution: BigQuery, Snowflake, Databricks or similar
- an in-depth understanding of big data aspects such as data storage, modeling, processing and scheduling
- data modeling and data storage experience
- ensuring solution quality through automated tests, CI/CD and code review
- proven collaboration with business stakeholders
- English proficiency at B2 level, communicative Polish

Nice to have:
- knowledge of dbt, Docker and Kubernetes, Apache Kafka
- familiarity with Apache Airflow or a similar pipeline orchestrator
- another JVM language (Java/Scala/Kotlin)
- experience in Machine Learning projects
- understanding of Apache Spark or a similar distributed data processing framework
- familiarity with one of the BI tools: Power BI, Looker or Tableau
- willingness to share knowledge (conferences, articles, open-source projects)

What's on offer:
- 100% remote work, with a workation opportunity
- 20 free days
- onboarding with a dedicated mentor
- project switching possible after a certain period
- an individual budget for training and conferences
- benefits: Medicover private medical care, co-financing of the Medicover Sport card
- the opportunity to learn English with a native speaker
- regular company trips and informal get-togethers

Development opportunities at Datumo:
- participation in industry conferences
- establishing Datumo's online brand presence
- support in obtaining certifications (e.g. GCP, Azure, Snowflake)
- involvement in internal initiatives, like building technological roadmaps
- a training budget
- access to internal technological training repositories

Discover our exemplary projects:

IoT data ingestion to the cloud
The project integrates data from edge devices into the cloud using Azure services. The platform supports data streaming either via the IoT Edge environment with Java or Python modules, or through a direct connection to Event Hubs using the Kafka protocol. It also facilitates batch data transmission to ADLS. Transformation from raw telemetry to structured tables is done through Spark jobs in Databricks, or via data connections and update policies in Azure Data Explorer.
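As a rough illustration of the raw-telemetry-to-structured-tables step described above: the project itself runs this as Spark jobs in Databricks, and the payload shape (device_id, ts, readings) used here is invented for the sketch, not the project's actual schema.

```python
import json
from datetime import datetime, timezone

def flatten_telemetry(raw: str) -> list[dict]:
    """Turn one raw telemetry payload (a JSON string) into flat rows,
    one row per sensor reading. The field names are hypothetical; the
    real project defines its own schema and does the equivalent work
    at scale in Spark."""
    event = json.loads(raw)
    event_time = datetime.fromtimestamp(event["ts"], tz=timezone.utc).isoformat()
    return [
        {
            "device_id": event["device_id"],
            "event_time": event_time,
            "sensor": name,
            "value": float(value),
        }
        for name, value in event["readings"].items()
    ]

# Example: one edge-device message carrying two sensor readings
raw = '{"device_id": "edge-01", "ts": 1700000000, "readings": {"temp": 21.5, "humidity": 40}}'
rows = flatten_telemetry(raw)
```

In the Databricks version the same reshaping would typically be expressed with DataFrame operations over a streaming source rather than per-message Python.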
Petabyte-scale data platform migration to Google Cloud
The goal of the project is to improve the scalability and performance of the data platform by migrating over a thousand active pipelines to GCP. The main focus is on rearchitecting existing Spark applications to either Cloud Dataproc or BigQuery SQL, depending on the client's requirements, and automating them using Cloud Composer.

Data analytics platform for an investment company
The project centers on developing and overseeing a data platform for an asset management company focused on ESG investing. Databricks is the central component. The platform, built on the Azure cloud, integrates various Azure services for diverse functionalities. The primary task involves implementing and extending complex ETL processes that enrich investment data, using Spark jobs written in Scala. Integrations with external data providers, as well as solutions for improving data quality and optimizing cloud resources, have also been implemented.

Real-time Consumer Data Platform
The initiative involves building a consumer data platform (CDP) for a major Polish retail company. Datumo has been involved since the project's start, contributing to the design of the platform's architecture. The CDP is built on Google Cloud Platform (GCP), using services such as Pub/Sub, Dataflow and BigQuery. Open-source tools, including a Kubernetes cluster with Apache Kafka, Apache Airflow and Apache Flink, are used to meet specific requirements. This combination gives the platform considerable flexibility.

Recruitment process:
- Quiz - 15 minutes
- Soft skills interview - 30 minutes
- Technical interview - 60 minutes

Find out more by visiting our website - https://www.datumo.io
If you like what we do and you dream about creating this world with us - don't wait, apply now
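The migration project above mentions rearchitecting each Spark application to either Cloud Dataproc or BigQuery SQL depending on the client's requirements. A toy sketch of that per-pipeline routing decision follows; the descriptor fields and the rule itself are assumptions for illustration, not the project's actual criteria.

```python
from dataclasses import dataclass

@dataclass
class Pipeline:
    """Hypothetical pipeline descriptor; the real migration inventories
    over a thousand pipelines with far richer metadata."""
    name: str
    uses_custom_code: bool    # UDFs, ML libraries, non-SQL logic
    pure_sql_transform: bool  # expressible entirely as SQL

def choose_target(p: Pipeline) -> str:
    """Pick a GCP target for one pipeline: SQL-only transforms move to
    BigQuery SQL, anything needing custom Spark code stays on Spark via
    Cloud Dataproc. (Assumed rule, for illustration only.)"""
    if p.pure_sql_transform and not p.uses_custom_code:
        return "bigquery-sql"
    return "cloud-dataproc"

targets = {
    p.name: choose_target(p)
    for p in [
        Pipeline("daily_sales_agg", uses_custom_code=False, pure_sql_transform=True),
        Pipeline("clickstream_sessionize", uses_custom_code=True, pure_sql_transform=False),
    ]
}
```

In practice each routed pipeline would then be wrapped in a Cloud Composer (managed Airflow) DAG for scheduling, as the description notes.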
-
Cloud Data Engineer
3 days ago
Lodz, Polska - Sii Sp. z o.o. - Full-time
Cloud Data Engineer
Workplace: Łódź
Technologies we use - Required: Spark/PySpark, Microsoft Azure, ETL, SQL, Python, Databricks. Nice to have: Snowflake, Apache Kafka, Apache Airflow.
About the project: Do you want to develop your skills in cloud technologies? Join our specialized unit bringing together experts in the field of...
-
Data Engineer
7 days ago
Lodz, Polska - Sollers Consulting - Full-time
Data Engineer (with Snowflake/Databricks)
Workplace: Łódź
Technologies we use - Expected: Azure, AWS, SQL, Python.
About the project: Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives supporting the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our...
-
Data Engineer
7 days ago
Lodz, Polska - Incubly - Full-time
Hey! Nice to see you! Let's share our stories to get to know each other a bit better... We are business and technology enthusiasts constantly hungry for new challenges and self-development, and nothing motivates us more than great software products and happy customers. At Incubly, we believe that great people want to work with great people, so we...
-
Senior Data Engineer with AWS and Snowflake
3 weeks ago
Lodz, Polska - Sii Sp. z o.o. - Full-time
Senior Data Engineer with AWS and Snowflake
Workplace: Łódź
Technologies we use - Expected: Snowflake, Data Vault, Python, SQL.
About the project: We are looking for a Data Engineer with expertise in Snowflake, AWS, and ETL processes, who will work closely with AI scientists and data analysts to design, develop, and maintain data pipelines and systems that...
-
Cloud Data Engineer – financial sector
2 weeks ago
Lodz, Polska - Sii - Full-time
Join a payments-sector project as a Data Engineer. The team is responsible for building, developing and maintaining a global data platform that integrates payment and terminal information from various source systems around the world. The specialists handle the extraction, transformation and delivery of data...
-
Data Engineer
7 days ago
Lodz, Polska - CLOUDFIDE - Full-time
You are: passionate about Cloud and data analytics; curious and eager to learn new technologies; someone who would like to work with a team of like-minded people.
Opportunity overview: You will work on a project involving a modern cloud data lake implementation, leveraging Databricks, CI/CD and cloud services as your daily driver.
Your impact zone: Implementing, and...
-
Senior Data Engineer
3 weeks ago
Lodz, Polska - TRANSITION TECHNOLOGIES PSC S.A - Full-time
Senior Data Engineer (with Snowflake)
Workplace: Łódź
Technologies we use - Expected: Python, Snowflake Data Cloud, Amazon AWS, SQL.
Your responsibilities: Design, build, and maintain scalable data pipelines (ETL/ELT) leveraging Snowflake and Airflow; implement optimized schemas, partitioning, and indexing strategies in Snowflake and relational databases...
-
Data Engineer with Snowflake and DBT – pharmaceutical industry
Lodz, Polska - Sii Sp. z o.o. - Full-time
Workplace: Łódź
Technologies we use - Expected: Data Build Tool (dbt), Snowflake, ETL, SQL, Talend, Google Cloud Platform. Optional: Python, Microsoft Azure, AWS.
About the project: We are looking for a Data Engineer to join our project for one of the largest pharmaceutical companies. In this role, you will...
-
Backend Data Engineer DevOps
1 week ago
Lodz, Polska - DeepSee.ai - Full-time
At DeepSee.ai, we're redefining how financial data is processed, reconciled, and delivered at scale. Our mission is to empower organizations with fully automated, cloud-native data systems that ensure accuracy, transparency, and performance across every financial workflow. We build intelligent ingestion, transformation, and reconciliation pipelines that...
-
Spotfire Engineer
19 hours ago
Lodz, Polska - Sii Sp. z o.o. - Full-time
Spotfire Engineer
Workplace: Łódź
Technologies we use - Expected: Spotfire, SQL, IronPython, HTML, CSS, Java, AWS.
About the project: Join a project for a global medical organization driving a major GxP end-to-end analytics initiative. As a Senior Spotfire Engineer, you will design and deliver advanced Spotfire dashboards and data models that empower business...