Senior Azure Data Engineer with Databricks
4 weeks ago
Senior Azure Data Engineer with Databricks
Workplace: Gdańsk
Technologies we use: Operating system: Windows, macOS

About the project
We are seeking an experienced and self-driven Data Engineer to design, build, and maintain the modern data infrastructure that supports our analytics, data science, and product teams. In this role, you will develop scalable data pipelines and manage large, complex datasets using Azure Databricks and the broader Azure data ecosystem. Working closely with engineers, analysts, and data scientists, you will ensure that data is reliable, accessible, and optimized for insight and decision-making across the organization. This is an exciting opportunity to shape and scale our data platform and contribute to the next generation of data-driven solutions.

Your responsibilities
- curating structured, semi-structured, and unstructured data by creating efficient, cost-effective, and scalable pipelines leveraging Databricks and the Azure Analytics stack,
- creating and maintaining robust data pipeline architecture, ensuring data quality, reliability, and scalability,
- assembling and managing large, complex data sets to meet functional and non-functional business requirements,
- identifying, designing, and implementing process improvements, including automation of manual processes and optimization of data delivery,
- mentoring and coaching team members to achieve engineering excellence,
- working with architects to design robust, scalable, and cost-effective data solutions.

Our requirements
- min. 6 years in a Data Engineer role, with a degree in Computer Science, Statistics, Informatics, Information Systems, or a related field (a solid software engineering background is a must),
- exposure to integration or streaming analytics projects,
- strong experience with Databricks and Data Factory, including recent features available in these tools,
- experience with the Microsoft Azure analytics stack, as well as real-time analytics tools,
- advanced proficiency in SQL and Python, and familiarity with various databases,
- experience in writing unit/performance tests,
- working knowledge of Git,
- solid software engineering and DataSecOps background, including CI/CD tools expertise,
- understanding of data modelling techniques for analytics,
- awareness of data management frameworks (e.g., DAMA) and exposure to data quality and catalogue tools such as Collibra or Alation will be highly rewarded,
- API development, as well as optimised strategies to serve real-time data,
- good verbal and written communication skills in Polish and English.

Work from the European Union region and a work permit are required.

Nice to have
- openness to work between 7-8 a.m. and 3-4 p.m. CET.

How we organize our work
- How we work at the client's: you focus on one project
- Team composition: big data developer
- Development opportunities we offer: development budget

Benefits
- co-financing of sports activities
- private medical care
- co-financing of language courses
- co-financing of training and courses
- life insurance
- flexible working hours
- team-building events
- no dress code
- video games at work
- coffee / tea
- beverages
- employee parking
- relaxation zone
- employee referral program

Recruitment stages
1. CV review
2. HR call
3. Interview (with live coding)
4. Client interview (with live coding)
5. Hiring Manager interview
6. Decision

Xebia sp. z o.o.
Who We Are
While Xebia is a global tech company, our journey in CEE started with two Polish companies: PGS Software, known for world-class cloud and software solutions, and GetInData, a pioneer in Big Data. Today, we're a team of 1,000 experts delivering top-notch work across cloud, data, and software. And we're just getting started.

What We Do
We work on projects that matter, and that make a difference. From fintech and e-commerce to aviation, logistics, media, and fashion, we help our clients build scalable platforms, data-driven solutions, and next-gen apps using ML, LLMs, and Generative AI. Our clients include Spotify, Disney, ING, UPS, Tesco, Truecaller, AllSaints, Volotea, Schmitz Cargobull, Allegro, and InPost. We value smart tech, real ownership, and continuous growth. We use modern, open-source stacks, and we're proud to be trusted partners of Databricks, dbt, Snowflake, Azure, GCP, and AWS. Fun fact: we were the first AWS Premier Partner in Poland.

Beyond Projects
What makes Xebia special? Our community. We run events like the Data&AI Warsaw Summit, organize meetups (Software Talks, Data Tech Talks), and have a culture that actively supports your growth via Guilds, Labs, and personal development budgets, for both tech and soft skills. It's not just a job. It's a place to grow. What sets us apart? Our mindset. Our vibe. Our people. And while that's hard to capture in text, come visit us and see for yourself.

The controller of your personal data is Xebia sp. z o.o., with its registered office in Wrocław at ul. Sucha 3. We will process the data contained in your CV solely for the purpose of conducting the recruitment process. The legal basis for processing your data is Art. 22¹(1) of the Polish Labour Code. We will process your personal data until the recruitment process is completed. If you give us separate consent, we will also process your data in future recruitment processes.
You have the right to access your personal data and the right to rectify or erase it, restrict its processing, the right to data portability, the right to object, and the right to withdraw consent to its processing at any time without affecting the lawfulness of processing carried out on the basis of consent before its withdrawal. To exercise any of these rights, send an e-mail with your request to: gdpr.pl@xebia.com. If you believe we are processing your data unlawfully, you may lodge a complaint with the supervisory authority at ul. Stawki 2 in Warsaw. We may share your data only with your consent or with authorized entities, should such a need arise. Your data will not be subject to automated processing.
-
Senior Azure Data Engineer
8 hours ago
Gdansk, Poland | Capgemini Polska | Full-time
Senior Azure Data Engineer
Workplace: Gdańsk
Technologies we use (expected): Python, SQL, Data Lake Gen2, Event Hub, Data Factory, Databricks, Azure DWH, API, Azure, Azure Function, Power BI
Operating system: Linux
About the project: Insights & Data delivers state-of-the-art Data solutions. Our expertise primarily lies in Cloud & Big Data engineering, where we develop...
-
Data Engineer
4 days ago
Gdansk, Poland | Sollers Consulting | Full-time
Data Engineer (with Snowflake/Databricks)
Workplace: Gdańsk
Technologies we use (expected): Azure, AWS, SQL, Python
About the project: Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives to support the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our...
-
DevOps Engineer
8 hours ago
Gdansk, Poland | ERGO Technology & Services S.A. | Full-time
DevOps Engineer (Databricks)
Workplace: Gdańsk
Your responsibilities: managing deployment automation of Azure PaaS/SaaS services within the context of a Big Data & Analytics platform; carrying out automation of configuration validation and re-establishment to comply with IT security requirements, and ensuring platform stability; implementing end-to-end...
-
Senior Data Engineer
4 weeks ago
Gdansk, Poland | ERGO Technology & Services S.A. | Full-time
Senior Data Engineer
Workplace: Gdańsk
Your responsibilities: designing and implementing data models and architectures that support financial data management; developing and maintaining ETL processes to integrate data from diverse sources, including market data feeds, customer information systems, and transactional databases; ensuring data quality,...
-
Data Engineer
4 weeks ago
Gdansk, Poland | Kyotu Technology | Full-time
Data Engineer
Location: Poland / 100% remote
Contract: B2B
Capacity: Full-time
Who we are: Kyotu Technology is a boutique software house based in Wrocław and Warsaw, working fully remotely or in hybrid mode from anywhere in Poland. We partner with companies from Germany, Switzerland, Western Europe, the United States, and the Middle East, focusing on...
-
Data Engineer
4 weeks ago
Gdansk, Poland | ERGO Technology & Services S.A. | Full-time
Data Engineer
Workplace: Gdańsk
Your responsibilities: designing and implementing data models and architectures that support financial data management; developing and maintaining ETL processes to integrate data from diverse sources, including market data feeds, customer information systems, and transactional databases; ensuring data quality, security, and...
-
Databricks Tech Lead
8 hours ago
Gdansk, Poland | KPMG | Full-time
Databricks Tech Lead - Data & Cloud Team
Workplace: Gdańsk
Technologies we use (required): Databricks, Data Lake, Azure Data Factory, Azure DevOps, CI/CD, Python, SQL, BI
About the project: The Data & Cloud team provides our clients with services covering broadly understood data analytics, data platform modelling, and Business...
-
Senior DevOps Engineer
1 week ago
Gdansk, Poland | KUBO | Full-time
For our client – a global technology company building and operating large-scale data and analytics platforms – we are looking for a Senior DevOps Engineer. You'll join an international team supporting cloud-based solutions used across multiple business areas. Your focus will be on improving reliability, automation, and deployment processes within a...
-
Data Engineer SQL Specialist
3 weeks ago
Gdansk, Poland | emagine Polska | Full-time
Start: ASAP
Contract: B2B, 12 months with possible prolongations
Work mode: 100% remote
Project language: English
Recruitment process: 1 screening, 2 interviews
Summary: The primary objective of the Data Engineer / SQL Specialist role is to enhance the organization's analytical capabilities by developing and managing SQL-based data solutions on the Databricks platform....
-
Azure Data Tech Lead
4 days ago
Gdansk, Poland | Link Group | Full-time
Azure Data Tech Lead
Responsibilities: designing and developing data architecture in the Azure and Databricks environment; creating and optimizing data pipelines using Python and SQL; overseeing data quality, integration, and information flow between systems; making technical decisions regarding tools, processes, and data processing methods....