Cloud Engineer Snowflake
4 weeks ago
As a Cloud Engineer specializing in Snowflake, you will play a pivotal role within our Insights & Data team, contributing to the development of advanced data solutions. Your primary responsibilities will include designing, developing, and maintaining Snowflake data pipelines to support various business functions.
You will collaborate with cross-functional teams to understand data requirements and implement scalable solutions. Additionally, you will optimize data models and schemas for performance and efficiency, ensure data integrity, quality, and security throughout the data lifecycle, and implement monitoring and alerting systems to proactively identify and address issues.
You will also be tasked with planning and executing migrations from on-premises data warehouses to Snowflake, developing AI, ML, and Generative AI solutions, and staying updated on Snowflake best practices and emerging technologies to drive continuous improvement.
Key Responsibilities:
- Designing and implementing Snowflake data pipelines
- Maintaining data quality and security
- Collaborating with cross-functional teams
- Developing AI, ML, and Generative AI solutions
We are a renowned company delivering state-of-the-art data solutions, with a primary focus on Cloud & Big Data engineering. We build robust systems capable of processing large and complex datasets, using specialized cloud data services across platforms such as AWS, Azure, and GCP.
Our expertise spans the entire Software Development Life Cycle (SDLC) of these solutions, with a strong emphasis on leveraging data processing tools, extensive programming, and the adoption of DevOps tools and best practices.
Furthermore, within our AI Center of Excellence, we undertake Data Science and Machine Learning projects, focusing on cutting-edge areas such as Generative AI, Natural Language Processing (NLP), Anomaly Detection, and Computer Vision.
Requirements:
To be successful in this role, you will need:
- A minimum of 3 years of experience in Big Data or Cloud projects, specifically in processing and visualization of large and/or unstructured datasets, including at least 1 year of hands-on Snowflake experience.
- An understanding of Snowflake's pricing model and cost optimization strategies for managing resources efficiently.
- Experience in designing and implementing data transformation pipelines natively with Snowflake or Service Partners.
- Familiarity with Snowflake's security model.
- Practical knowledge of at least one Public Cloud platform in Storage, Compute (+Serverless), Networking, and DevOps, supported by commercial project work experience.
- At least basic knowledge of SQL and of one programming language: Python, Scala, Java, or Bash.
- A proficient command of English.
We offer a competitive salary of approximately €80,000 - €110,000 per year, depending on experience, plus benefits including:
- A permanent employment contract from the first day.
- A hybrid, flexible working model.
- An equipment package for home office.
- Private medical care with Medicover.
- Life insurance.
- NAIS benefit platform.
- Access to 70+ training tracks with certification opportunities, and platforms with free access to Pluralsight, TED Talks, Coursera, Udemy Business, and SAP Learning HUB.
- Community Hub with over 20 professional communities focused on areas such as Salesforce, Java, Cloud, IoT, Agile, AI.