2025-11-25 10:25:15 | Devire | Your responsibilities: Creating and implementing data standards and metadata frameworks so that data can be used as a shared resource across the organization. Designing scalable data models and data-flow diagrams. Designing...
2025-12-02 15:41:55 | What are we looking for? At least 5 years of proven experience in data engineering roles. A strong data engineering background in a data warehouse or data lake architecture. Experience working in AWS/GCP cloud infrastructure. Experience developing...
2025-12-02 10:25:14 | AUCTANE Poland Sp. z o.o | Your responsibilities: Designing the architecture and producing technical documentation. Designing ELT processes. Implementing key platform capabilities: RBAC, the ELT framework, etc. Developing the migration strategy. Designing...
2025-12-01 15:56:05 | Link Group | Warszawa | Your responsibilities: Mentor and support a team of data engineers while staying hands-on in pipeline development and architecture design. Build scalable, serverless, event-driven ETL pipelines to process and transform large datasets from...
2025-12-01 13:41:41 | Your responsibilities: Lead the design and evolution of our Medallion Architecture-based framework, ensuring scalability, governance, and best practices. Partner with other development teams and stakeholders to drive architecture decisions and...
2025-12-01 12:41:33 | Which technologies and skills are important for us? Very good knowledge of data pipeline orchestration (designing scalable, cloud-native data pipelines for data transformation and aggregation based on business use cases). Very good knowledge of GCP (o...
2025-11-29 10:25:12 | Commerzbank 4,4 | Your responsibilities: Work on engaging projects with one of the largest banks in the world, projects that will transform the financial services industry. Develop new and enhance existing financial and data solutions, having the opportunity...
2025-11-28 14:41:46 | Your responsibilities: Define and lead the architecture of a greenfield data platform or data lakehouse. Design data models, integration patterns, and governance frameworks to ensure data quality, consistency, and accessibility. Collaborate wi...
2025-11-28 12:41:43 | The client is one of the largest companies in the agribusiness and food production industry in the world. Basic information: Location: 1 day/month from Warsaw. Rate: PLN 110-150/hour net + VAT. Type of work: B2B contract. Duration: initial order 3 months.
2025-11-28 11:53:52 | Ideal candidate: Has 5+ years of commercial experience in implementing, developing, or maintaining data load systems (ETL/ELT). Demonstrates strong programming skills in Python, with a deep understanding of data-related challenges. Has hands-on...
2025-11-28 10:25:14 | KMD Poland | Your responsibilities: You will design and implement scalable, secure, and efficient data pipelines that can handle large amounts of data. You will build big data solutions using Azure services such as Azure Data Lake Storage, Azure Databricks, ...
2025-11-27 15:41:58 | Your responsibilities: Managing and leading projects across process, technology, and measurement development deliverables in the areas of metrology (optical/dimensional testing of optical components and systems). Designing, developing, and...
2025-11-27 13:41:26 | Your responsibilities: Develop innovative solutions for complex data challenges, using creativity and technology. Design and build data systems that fulfill client requirements and optimize user experience. Collaborate across teams to integrat...
2025-11-27 12:41:31 | We are looking for a person for the position of Data Engineer / Big Data. What we expect: Experience with Python, PySpark, Pandas, JupyterLab (working with notebooks). Experience using the AWS platform. Experience with continuous integration and continuous deli...
2025-11-27 11:53:58 | Your responsibilities: Lead generation or scoring systems - you've built engines that identify sales opportunities from behavioral/firmographic data. B2B intent signals - experience working with job posting data, funding signals, company growth...
2025-11-27 09:41:18 | Senior Data Engineer, R0019530_POL_rxr-1: We build data products that power the next generation of gaming experiences, from player behavior analysis and modeling to internal analytics. As a Senior Data Engineer you will be responsible for...
2025-11-27 06:59:44 | Aristocrat | Your responsibilities: Design and implement a greenfield data lakehouse architecture. Build and optimize data models, schemas, and integration patterns. Develop scalable data storage and processing solutions using Azure and open-source...
2025-11-26 15:41:29 | Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. 3+ years of experience in data engineering or a similar role. Strong proficiency in Python and familiarity with frameworks such as Spark or Scala. Proven experie...
2025-11-26 10:25:16 | Harvey Nash Technology 4,7
2025-11-26 10:10:02 | Aristocrat | Your responsibilities: Design & Develop Database Solutions: Architect, design, and implement highly optimized relational (e.g., MySQL, PostgreSQL, AWS Aurora, SQL Server) and NoSQL (e.g., MongoDB, DynamoDB, Redis) database schemas, ensuring data...
2025-11-26 09:41:21 | 100% remote work. B2B contract via Michael Page.
2025-11-25 17:59:01 | Your responsibilities: Develop and maintain data infrastructure for analytics here at OEC. Build pipelines to extract data from primary and secondary sources. Build and support data visualization for business and product needs. Work with...
2025-11-25 15:41:23 | Your responsibilities: Design scalable data processing pipelines for streaming and batch processing using Big Data technologies like Databricks, Airflow and/or Dagster. Contribute to the development of CI/CD and MLOps processes. Develop...
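Several listings in this dump, including the entry above, ask for orchestration of batch and streaming pipelines with tools such as Airflow or Dagster. Purely as an illustration of what such a pipeline definition looks like (not code from any posting), here is a minimal Airflow DAG; it assumes Apache Airflow 2.x, and the DAG id, task names, and task logic are hypothetical.

```python
# Minimal batch-pipeline DAG sketch; assumes Apache Airflow 2.x (schedule= syntax).
# DAG id, task names, and the toy extract/transform logic are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull raw records from a source system.
    return [{"order_id": 1, "amount": 42.0}, {"order_id": 2, "amount": -1.0}]


def transform(ti, **context):
    # Placeholder: read the upstream result via XCom and apply a simple quality rule.
    rows = ti.xcom_pull(task_ids="extract")
    return [r for r in rows if r["amount"] >= 0]


with DAG(
    dag_id="daily_sales_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # run transform only after extract succeeds
```

In Dagster the same idea would be expressed with ops or assets instead of operators; the point is only to show the shape of a dependency-ordered pipeline definition.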
2025-11-25 14:41:19 | Your responsibilities: Participates in the industrialization and scaling-up of successful advanced analytics & AI solution prototypes into IT products. Develops generalized frameworks, components, patterns, and libraries for repetitive data science...
2025-11-25 13:41:22 | Min. 7 years of experience in designing and developing DWH/Big Data platforms. Very good knowledge of Snowflake. Good knowledge of Azure (ADLS, ADF, Key Vault, security) and SQL. Experience in designing ETL/ELT processes and frameworks. Knowledge of costs...
2025-11-25 10:25:16 | ITFS sp. z o. o | 5+ years of experience as a Data Engineer or ETL Developer, building large-scale data pipelines in a cloud environment (AWS, GCP, or Azure). Strong SQL expertise, including query optimization and data modeling. Hands-on experience with ETL/ELT t...
2025-11-25 10:25:15 | Matrix Global Services | Proficiency in AWS: experience with data processing, storage, and orchestration services (such as AWS Lambda, Redshift, Glue, S3). Advanced SQL and Python skills: building pipelines, data transformation and quality checks, automation, and integ...
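The AWS-focused requirements above combine Lambda, S3, Glue, and Redshift with Python-based transformation and quality checks. As a loose sketch of the kind of serverless quality check this implies (not taken from any posting), here is an S3-triggered Lambda handler; the bucket layout, CSV schema, and 5% threshold are invented, and it assumes the boto3 client that ships with AWS Lambda Python runtimes.

```python
# Sketch of an S3-triggered data quality check for an AWS Lambda Python runtime.
# Bucket names, the expected CSV schema, and the failure threshold are hypothetical.
import csv
import io

import boto3

s3 = boto3.client("s3")


def handler(event, context):
    # Standard S3 event structure: pick the object that triggered the invocation.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    rows = list(csv.DictReader(io.StringIO(body)))

    # Simple completeness check: count rows missing the "amount" column.
    missing = sum(1 for r in rows if not r.get("amount"))
    result = {"bucket": bucket, "key": key, "rows": len(rows), "missing_amount": missing}

    # Fail loudly if more than 5% of rows are incomplete (illustrative threshold),
    # so a downstream Glue/Redshift load step is never fed bad data.
    if rows and missing / len(rows) > 0.05:
        raise ValueError(f"Quality check failed: {result}")
    return result
```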
2025-11-25 10:25:15 | Devire | Your responsibilities: Develop SystemVerilog UVM testbenches and resolve complex testbench challenges. Define and implement a functional coverage model to ensure complete design verification. Develop a deep understanding of complex SoCs and IP...
2025-11-24 13:41:24 | Your responsibilities: Manage customer’s problems through diagnosis, resolution, or implementation of new investigation tools to increase productivity for customer issues on AI/ML infrastructure. Develop an understanding of AI/ML workloads and...
2025-11-24 10:10:00 | MoonPay | Your responsibilities: Develop and maintain data delivery pipelines for a leading IT solution in the energy market, leveraging Apache Spark, Databricks, Delta Lake, and Python. Have end-to-end responsibility for the full lifecycle of features y...
2025-11-24 00:40:20 | Your responsibilities: Design, develop, improve, and industrialize optical fiber manufacturing platforms (platform = process + equipment). Drive root-cause understanding of manufacturing limitations using scientific methods. Translate experiment...
2025-11-23 15:41:07 | What we expect: 5+ years of professional experience as a Data Engineer or Software Engineer in data-intensive environments. Strong Python development skills, with solid understanding of OOP, modular design, and testing (unit/integration). Experienc...
2025-11-23 10:25:12 | hubQuest | Your responsibilities: Create means to transform data obtained from variable speed drives and production tests into easy-to-understand indicators and insights. Develop advanced algorithms and models for embedded and edge devices. Coach and support...
2025-11-22 00:40:15 | Your responsibilities: Designing, developing, and maintaining data pipelines using Azure Databricks and Delta Lake. Implementing data applications in Python/PySpark for ingestion, historization, integration, and filtering. Developing data qualit...
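Postings like this one (and the Databricks entries above) describe PySpark applications that ingest, filter, and historize data into Delta Lake. As a rough sketch only, with invented paths and column names, a single ingestion step on a cluster with Delta Lake support might look like this:

```python
# PySpark ingestion sketch writing to a Delta table; assumes a cluster with Delta Lake
# support (e.g., Databricks or delta-spark). Paths and column names are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read a batch of raw JSON files landed by an upstream process (hypothetical path).
raw = spark.read.json("/mnt/landing/orders/2025-11-28/")

# Basic filtering plus a load timestamp, so later layers can historize the records.
cleaned = (
    raw.filter(F.col("order_id").isNotNull())
       .withColumn("_loaded_at", F.current_timestamp())
)

# Append into a bronze-layer Delta table (hypothetical location).
cleaned.write.format("delta").mode("append").save("/mnt/bronze/orders")
```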
2025-11-21 15:41:49 | Your responsibilities: Designing and developing network infrastructure in data center environments (spine-leaf architecture, fabric, EVPN, VXLAN). Configuring and administering SAN networks and integrating them with storage systems from leading vendors.
2025-11-21 11:41:44 | Your responsibilities: Lead cybersecurity incident response and investigations for moderate- to high-complexity events. Analyze large and complex technical data sets to identify abnormal user, network, and system activity warranting further...
2025-11-21 10:41:21 | What you bring to the team: 5 years of experience in analytics or data engineering. A degree in computer science, engineering, or a related field. Solid knowledge of data modeling (e.g., star schema) and of building...
2025-11-21 10:25:13 | dmTECH | 5+ years of software development experience, strong in Python and SQL. Experience with web technologies (HTML, JavaScript, APIs) and Linux. Familiarity with web scraping tools (Selenium, Scrapy, Postman, XPath). Knowledge of containerization (Do...
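This entry names web-scraping tooling (Selenium, Scrapy, XPath) alongside Python. Purely as an illustration of the XPath-based extraction style it refers to, here is a minimal Scrapy spider; the start URL, element classes, and field names are placeholders, not taken from any real site.

```python
# Minimal Scrapy spider sketch; assumes `pip install scrapy`. The start URL and the
# XPath selectors below are placeholders, not a real job board's markup.
import scrapy


class JobsSpider(scrapy.Spider):
    name = "jobs"
    start_urls = ["https://example.com/jobs"]

    def parse(self, response):
        # Assume each listing sits in a <div class="job"> block.
        for job in response.xpath('//div[@class="job"]'):
            yield {
                "title": job.xpath(".//h2/text()").get(),
                "company": job.xpath('.//span[@class="company"]/text()').get(),
            }

        # Follow pagination if a rel="next" link is present.
        next_page = response.xpath('//a[@rel="next"]/@href').get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Run with `scrapy runspider jobs_spider.py -o jobs.json` to collect the yielded items into a JSON file.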
2025-11-21 10:25:13 | Link Group
2025-11-21 10:10:08 | Emnify 5,0
2025-11-21 10:10:08 | Luxoft DXC | Your responsibilities: As part of our Customer Management & Core Finance/Accounting product team, this position will be in Poland. This may be dependent on the successful candidate’s background, experience, and proficiency. As a Software Analys...
2025-11-21 09:42:19 | Your responsibilities: E2E delivery of assigned projects, such as developing performance KPIs and developing data models based on multiple data sources in alignment with standards and business requirements, including but not limited to data...
2025-11-20 16:41:01 | Your responsibilities: Design, develop, and optimize Databricks pipelines using PySpark and Spark SQL. Build and support Azure Observability components to ensure visibility and reliability across data ecosystems. Develop and maintain Power BI...
2025-11-20 12:41:57 | REQUIREMENTS: 5+ years of commercial experience as a Data Engineer. Strong expertise in AWS (particularly within the stack mentioned above). Focus on data processing. Upper-Intermediate or higher level of English. ELEKS Software Engineering & Develop...
2025-11-20 10:25:14 | Eleks | Which technologies and skills are important for us? Very good knowledge of data pipeline orchestration (designing scalable, cloud-native data pipelines for data transformation and aggregation based on business use cases). Very good knowledge of GCP (o...