Current job offers related to Data Engineer - Poznań, wielkopolskie - Allegro
-
Data Engineer
4 weeks ago
Poznań, wielkopolskie, Poland; Capgemini Polska; Full-time
Data Engineer (GCP), workplace: Poznań
Technologies we use (operating system): Windows, Linux
About the project: Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you’d like, where you’ll be supported and inspired by a collaborative community of colleagues around the world, and where you’ll be able to...
-
Data Engineer
3 weeks ago
Poznań, wielkopolskie, Poland; Sollers Consulting; Full-time
Data Engineer (with Snowflake/Databricks), workplace: Poznań
Technologies we use (expected): Azure, AWS, SQL, Python
About the project: Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives to support the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our mission is...
-
ML Engineer
3 days ago
Poznań, wielkopolskie, Poland; Sii Sp. z o.o.; Full-time
ML Engineer / Data Scientist (f/m/x), workplace: Poznań
Technologies we use (required): Python, Pandas, NumPy, Machine Learning, PyTorch, scikit-learn, AWS/Azure/GCP; (nice to have): RAG, GenAI, OpenCV, LLM, MLOps, Hugging Face, Transformers, Keras, TensorFlow, MLFlow
About the project: Due to the growth of the Data & Analytics Competency Center, a specialized unit bringing together...
-
Cloud Data Engineer
2 weeks ago
Poznań, wielkopolskie, Poland; Sii Sp. z o.o.; Full-time
Cloud Data Engineer, workplace: Poznań
Technologies we use (required): Microsoft Azure, ETL, SQL, Python, Databricks, Spark, PySpark; (nice to have): Snowflake, Apache Kafka, Apache Airflow
About the project: Do you want to develop your skills in cloud technologies? Join our specialized unit bringing together experts in data processing and analysis...
-
Azure Data Engineering Expert
3 weeks ago
Poznań, wielkopolskie, Poland; Capgemini Polska; Full-time
Azure Data Engineering Expert, workplace: Poznań
Technologies we use (expected): Python, Microsoft Azure, Polars, Pandas, PySpark, PowerBI; (optional): database knowledge; operating system: Windows, Linux
About the project: YOUR TEAM: Insights & Data delivers state-of-the-art Data solutions. Our expertise primarily lies in Cloud & Big Data engineering, where we develop robust systems...
-
ML Engineer
3 days ago
Piła, wielkopolskie, Poland; Sii Sp. z o.o.; Full-time
ML Engineer / Data Scientist (f/m/x), workplace: Piła
Technologies we use (required): Python, Pandas, NumPy, Machine Learning, PyTorch, scikit-learn, AWS/Azure/GCP; (nice to have): RAG, GenAI, OpenCV, LLM, MLOps, Hugging Face, Transformers, Keras, TensorFlow, MLFlow
About the project: Due to the growth of the Data & Analytics Competency Center, a specialized unit bringing together more than...
-
Cloud Data Engineer
2 weeks ago
Piła, wielkopolskie, Poland; Sii Sp. z o.o.; Full-time
Cloud Data Engineer, workplace: Piła
Technologies we use (required): Microsoft Azure, ETL, SQL, Python, Databricks, Spark, PySpark; (nice to have): Snowflake, Apache Kafka, Apache Airflow
About the project: Do you want to develop your skills in cloud technologies? Join our specialized unit bringing together experts in data processing and analysis...
-
Senior/Lead Software Engineer
2 weeks ago
Poznań, wielkopolskie, Poland; ADDEPTO; Full-time
Senior/Lead Software Engineer, workplace: Poznań
Technologies we use (expected): Java, Oracle, Kafka, Kubernetes, WebLogic Server, DDD, RESTful API, Microsoft Azure; operating system: Windows, macOS
About the project: We are looking for a visionary and hands-on Senior/Lead Software Engineer to design and guide the implementation of robust, scalable, and secure software solutions...
-
Senior Databricks Engineer
4 weeks ago
Poznań, wielkopolskie, Poland; CRODU; Full-time; 30 zł - 240 zł
Hi! For our client from the USA we are looking for Azure Databricks Engineers. The work covers areas including migration, data collection, and optimization of Databricks-based solutions. The client has a constant demand for specialists. The projects they run are usually short-term (with a good chance of...
-
Senior Java Banking Developer
3 weeks ago
Poznań, wielkopolskie, Poland; Capgemini Polska; Full-time
Senior Java Banking Developer, workplace: Poznań
Technologies we use (expected): Java, Spring Framework; (optional): AWS, Microsoft Azure, NoSQL, Big Data / Data Science, UNIX
About the project: Our company is characterized by a dynamically developing project portfolio. One sector we provide services for is Banking and Capital Markets. Therefore, we are looking for experienced Software...
-
Data Engineer
2 hours ago
Data Engineer - IT Governance & Compliance
Workplace: Poznań
Technologies we use (expected): SQL, Python, Google Cloud Platform, Looker, Tableau, Git

About the project
The IT Compliance team is responsible for the compliance of the organization's IT systems, networks, and infrastructure with IT industry best practices and standards. Our main role is to provide leaders and managers with guidance and knowledge on IT Governance processes, backed by data, reports, dashboards, KPIs, visualizations, and business insights that support data-driven decision-making. Daily tasks are based on close collaboration with other members of the organization to develop and implement policies and procedures that help the company achieve the expected level of maturity in the Technology area.

Your responsibilities
- You will have access to advanced Data Engineering and Data Analysis tools and technologies, and to data of various types, in an environment where data-driven decision-making is a basic rule
- You will analyze and prepare data, propose directions for process development, and present the results to decision-makers
- You will develop data process flows, reports, dashboards, and presentations of findings (functional/technical) and opportunities for improvement, at various levels across the organization
- You will design, develop, and maintain data pipelines and ETL/ELT processes in GCP, including orchestration in Airflow (Cloud Composer); a minimal sketch of such a pipeline appears at the end of this posting
- You will take responsibility for the collection, analysis, and interpretation of technical KPI metrics in Looker Studio, supporting data-driven decision-making
- You will support technical leaders in defining the parameters of the services provided and ways to optimize them
- You will implement Data Mesh-aligned practices for domain ownership, data contracts, and standardized data sharing

Our requirements
- Love data and are eager to find the best way to let it speak
- Are not afraid of large volumes of data and can connect the dots with ease
- Are independent, precise, and curious about the data, and always verify whether the result seems reasonable and is fully logical
- Have 2+ years of experience as a Data Engineer or Data Analyst with a strong engineering mindset (experience in Technology is appreciated)
- Have very good knowledge of SQL and practical experience with Python (preferred for automation, ETL, scripting)
- Have experience using Google Cloud Platform (BigQuery, Cloud Storage, Cloud Functions, Cloud Composer/Airflow)
- Take care of reproducibility and reusability (knowledge of Git is appreciated)
- Have experience in data visualization in Looker Studio (knowledge of Tableau is appreciated)
- Want to develop both technical and business competencies in the IT Governance ecosystem
- Know English at B2 level or higher

Optional
- Knowledge of at least one of the following areas: Service Level Management, FinOps, IT Governance & Compliance

What we offer
- Flexible working hours in the hybrid model (4/1) - working hours start between 7:00 a.m. and 10:00 a.m. We also have 30 days of occasional remote work.
- An annual bonus based on your annual performance and company results.
- Well-located offices (with e.g. fully equipped kitchens, bicycle parking, terraces full of greenery) and excellent work tools (e.g. raised desks, ergonomic chairs, interactive conference rooms).
- A 16" or 14" MacBook Pro or a corresponding Dell with Windows (if you don't like Macs) and all the necessary accessories.
- A wide selection of fringe benefits in a cafeteria plan - you choose what you like (e.g. medical, sports or lunch packages, insurance, purchase vouchers).
- English classes that we pay for, related to the specific nature of your job.
- A training budget, inter-team tourism, hackathons, and an internal learning platform with multiple trainings.
- An additional day off for volunteering, which you can use alone, with a team, or with a larger group of people connected by a common goal.
- Social events for Allegro people - Spin Kilometers, Family Day, Fat Thursday, Advent of Code, and many other occasions we enjoy.

Benefits
- sharing the costs of sports activities
- private medical care
- sharing the costs of foreign language classes
- sharing the costs of professional training & courses
- life insurance
- flexible working time
- integration events
- no dress code
- leisure zone
- extra social benefits

#goodtobehere means that:
- You will join a team you can count on - we work with top-class specialists who have knowledge- and experience-sharing in their DNA.
- You will love our level of autonomy in team organization, the space for continuous development, and the opportunity to try new things. You get to choose which technology solves the problem, and you are responsible for what you create.
- You will value our Developer Experience and the full platform of tools and technologies that make creating software easier. We rely on an internal ecosystem based on self-service and widely used tools such as Kubernetes, Docker, Consul, GitHub, and GitHub Actions. Thanks to this, you can contribute to Allegro from your very first days on the job.
- You will be equipped with modern AI tools to automate repetitive tasks, allowing you to focus on developing new services and refining existing ones (also leveraging AI support).
- You will create solutions that will be used (and loved) by your friends, family, and millions of our customers.
- You will meet the Allegro scale, which starts with over 1000 microservices, an open-source data bus (Hermes) with 300K+ rps, a Service Mesh with 1M+ rps, tens of petabytes of data, and machine learning used in production.
- You will become part of Allegro Tech - we speak at industry conferences, cooperate with tech communities, run our own blog (for over 10 years now), record podcasts, lead guilds, and organize our own internal conference, the Allegro Tech Meeting. We create solutions we love to talk about (and are allowed to).
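
The pipeline and KPI work described in this posting (ETL/ELT on GCP orchestrated with Airflow on Cloud Composer, feeding dashboards in Looker Studio) could look roughly like the minimal sketch below. This is an illustrative assumption, not Allegro's actual code: it presumes Airflow 2.4+ with the Google provider package installed (as on Cloud Composer 2), and the DAG id, project, dataset, and table names are hypothetical placeholders.

# A minimal sketch only; Airflow 2.4+ with the Google provider package assumed.
# The DAG id, project, dataset and table names below are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="it_governance_kpi_daily",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Aggregate raw compliance events into a daily KPI table that a
    # Looker Studio dashboard can read directly from BigQuery.
    build_kpi_table = BigQueryInsertJobOperator(
        task_id="build_kpi_table",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE `my-project.governance.kpi_daily` AS
                    SELECT
                      DATE(event_ts) AS day,
                      COUNTIF(status = 'compliant') / COUNT(*) AS compliance_ratio
                    FROM `my-project.governance.raw_events`
                    GROUP BY day
                """,
                "useLegacySql": False,
            }
        },
    )

In a real setup the resulting table would simply be registered as a BigQuery data source in Looker Studio; ingestion of the raw events, data-quality checks, and alerting are left out to keep the sketch short.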