YOUR PROFILE: Hands-on experience designing and developing data pipelines with Databricks. Strong working knowledge of Delta Lake, Spark, and Databricks notebooks. Proficiency in SQL and Python for data transformation and analysis. Familiarity with …
2025-11-25 10:25:16 | Capgemini Invent | What You Bring: 3+ years of hands-on experience with SAP BW / data engineering; strong skills in SAP BW, HANA, ABAP, SQL, BEx; excellent English and strong communication abilities; a team-first mindset, natural curiosity, high motivation, and a positive …
2025-11-26 10:25:16 | Integral Solutions | Your responsibilities: designing and building the data architecture for the organization; integrating data from various sources, both batch and stream; automating ETL processes and orchestrating data flows; supporting …
2025-12-04 14:41:27 | Your responsibilities: lead an 8-person analytics team, managing day-to-day operations and fostering a high-performing, collaborative culture; drive data optimization projects that improve business processes and create measurable impact across …
2025-12-04 14:41:27 | Your responsibilities: designing and implementing processes for monitoring and improving data quality (DQ rules, comparative tests); building structures and data flows that support data audit and control; validating data compliance with …
2025-12-04 12:41:28 | Your responsibilities: conduct data profiling, validation, cleansing, and enrichment to ensure the highest data quality standards; plan, develop, monitor, and maintain data migration workflows customized to meet diverse client requirements; …
2025-12-04 09:41:29 | Your responsibilities: collect, analyze, and understand the data requirements of our internal customers in all parts of the company; advise our internal customers on the design of efficient data-driven processes; create and maintain modular …
2025-12-04 00:40:13 | Your responsibilities: creating scripts and tools in Python to automate business processes; designing, implementing, and maintaining ETL processes for loading data into a PostgreSQL database; optimizing and monitoring the performance of …
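Several postings in this log describe the same core task: scripting ETL loads into a relational database with Python. As a minimal, hedged sketch of that pattern (using the standard-library sqlite3 module as a stand-in for PostgreSQL, with invented table and column names):

```python
import sqlite3

# Hypothetical ETL sketch. sqlite3 stands in for PostgreSQL (a real pipeline
# would use a driver such as psycopg); the "sales" table and its columns are
# invented for illustration only.
def run_etl(raw_rows):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    # Transform step: drop rows with missing amounts, normalize region casing.
    cleaned = [(r["region"].strip().upper(), float(r["amount"]))
               for r in raw_rows if r.get("amount") is not None]
    # Load step: bulk-insert the cleaned rows.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", cleaned)
    conn.commit()
    return conn

rows = [{"region": "eu ", "amount": "10.5"},
        {"region": "us", "amount": None},
        {"region": "us", "amount": "3"}]
conn = run_etl(rows)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM sales").fetchone()
# total is (2, 13.5): one row was dropped for its missing amount.
```

The same extract/transform/load split applies unchanged when the target is PostgreSQL; only the connection and driver differ.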
2025-12-03 14:41:30 | Your responsibilities: leading the data integration process end-to-end, from gathering and understanding the business need, through source analysis, to implementation and plugging the data into the feeding process; designing, developing, and optimizing …
2025-12-03 14:41:30 | Your responsibilities: understands requirements to build, enhance, or integrate programs and processes for Acxiom client applications; able to read and interpret application design and functional specifications to write application code; …
2025-12-03 12:41:45 | Your responsibilities: architect and build scalable data systems; design and implement data pipelines and backend services connecting marketing, analytics, and CRM platforms using SQL, Python, and AWS technologies; develop and optimize MarTech …
2025-12-03 12:41:45 | Your responsibilities: design, develop, and maintain ETL/ELT pipelines and workflows using Databricks; collaborate with cross-functional teams to understand data requirements and deliver high-quality datasets; build and manage data lake and …
2025-12-03 10:41:28 | Your responsibilities: design, develop, and maintain ETL processes and scalable data pipelines to enable data integrity and accessibility; integrate data from diverse sources including APIs, message queues, and batch systems; parse, transform, and …
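The "integrate data from APIs and batch systems, then parse and transform" duty above usually reduces to flattening nested payloads into uniform records. A small sketch, with invented field names (`user`, `events`, `ts`, `value`) standing in for whatever a real source provides:

```python
import json

# Hypothetical example: flatten a nested API payload into flat tuples
# suitable for bulk loading. All field names are invented for illustration.
def flatten(payload: str):
    doc = json.loads(payload)
    # One output record per event, carrying the parent user id along;
    # a missing "value" defaults to 0.
    return [(doc["user"], e["ts"], e.get("value", 0)) for e in doc["events"]]

records = flatten('{"user": "u1", "events": [{"ts": 1, "value": 5}, {"ts": 2}]}')
# records == [("u1", 1, 5), ("u1", 2, 0)]
```

The same shape works whether the payload arrives from an HTTP API, a message queue, or a batch file: parse once, normalize to flat records, then hand off to the load step.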
2025-12-03 10:41:28 | You're a perfect match if you have: 5+ years of proven experience in data engineering; experience building data warehouses, preferably with Azure technologies; proven experience with Databricks; knowledge of how to create both normalized models and …
2025-12-03 10:25:15 | NewFire Global Partners | Requirements: minimum 8 years of experience in data engineering; at least 4 years of hands-on experience with Databricks, including services like data pipelines and Unity Catalog; at least 2 years of experience working with Big Data technologies; …
2025-12-03 10:25:15 | Upvanta | 2+ years' experience as a Data Engineer (or similar) working heavily with PostgreSQL. Expert SQL and strong PL/pgSQL: able to design, implement, and maintain complex stored procedures/functions. Deep PostgreSQL fundamentals: schema design and …
2025-12-03 10:25:15 | capital.com
2025-12-03 10:10:01 | Cytiva 3,4
2025-12-03 10:10:01 | Luxoft DXC | Your responsibilities: guarantee data reliability: own the DataOps implementation and CI/CD to ensure all production data pipelines are reliable, traceable, and scalable for AI initiatives; optimize data velocity: architect high-performance Airflow …
2025-12-03 09:41:18 | Your responsibilities: build scalable data processing pipelines; identify potential improvements and enhancements for current data processing solutions; advise on the utilization of appropriate tools and technologies; research new tools and …
2025-12-02 17:41:09 | At Scalo we deliver software projects and support our partners in growing their businesses. We build software that enables people to make changes, …
2025-12-02 16:30:02 | Your responsibilities: creating and implementing data standards and metadata frameworks so that data can be used as a shared resource of the organization; designing scalable data models and data-flow diagrams; designing …
2025-12-02 15:41:55 | Your responsibilities: design and develop scalable data pipelines in Azure; build and maintain Data Lake and Data Warehouse solutions; collaborate with cross-functional teams on data modeling and analytics; ensure data quality, consistency, and …
2025-12-02 13:41:31 | Your responsibilities: you handle broadly understood data engineering in on-premise environments and cloud solutions; you design new data processing pipelines in GCP; you support users with the technical aspects …
2025-12-02 10:41:46 | Your responsibilities: implementing and optimizing modern cloud-based solutions; building and launching new data models and data pipelines; implementing best practices in data engineering, including data integrity, quality, and documentation; …
2025-12-02 10:41:46 | Educational background: Bachelor's or Master's degree in CS, Engineering, IT, or a related field. Very good knowledge of English and Polish. Programming language: strong programming skills in languages such as Python, Java, or Scala. Cloud: experience …
2025-12-02 10:25:14 | Reply Polska | What are we looking for? At least 5 years of proven experience in data engineering roles; a strong data engineering background in a data warehouse or data lake architecture; experience working in AWS/GCP cloud infrastructure; experience developing …
2025-12-02 10:25:14 | AUCTANE Poland Sp. z o.o | Technical requirements: very good knowledge of SQL Server (T-SQL, query optimization, stored procedures); experience in designing data warehouses and analytical cubes (e.g. SSAS); ability to create and optimize SSRS reports (performance …
2025-12-02 10:25:14 | Square One Resources | Your responsibilities: designing the architecture and writing technical documentation; designing ELT processes; implementing key platform features, e.g. RBAC and the ELT framework; developing the migration strategy; designing …
2025-12-02 09:41:33 | Your responsibilities: collect, analyze, and understand the data requirements of our internal customers in all parts of the company; advise our internal customers on the design of efficient data-driven processes; create and maintain modular …
2025-12-02 08:40:37 | Requirements: you are a student or graduate of a technical or analytical field (computer science, mathematics, physics, economics); you are available …
2025-12-02 08:24:02 | Capgemini Polska sp. z o.o | You will join a team working on the most advanced Big Data initiatives in banking. We are a 12-person Scrum team delivering both business and R&D projects. Among other things, we are responsible for building and developing solutions …
2025-12-02 06:59:51 | GOLDENORE ITC sp. z o.o | Your responsibilities: mentor and support a team of data engineers while staying hands-on in pipeline development and architecture design; build scalable, serverless, event-driven ETL pipelines to process and transform large datasets from …
2025-12-01 13:41:41 | Your responsibilities: lead the design and evolution of our Medallion Architecture-based framework, ensuring scalability, governance, and best practices; partner with other development teams and stakeholders to drive architecture decisions and …
2025-12-01 12:41:33 | Must-have qualifications: 3+ years of experience in big data engineering; proficiency in Scala and experience with Apache Spark; strong understanding of distributed data processing and frameworks like Hadoop; experience with message brokers …
2025-12-01 10:25:13 | Link Group | Must-have qualifications: 3+ years of experience in data engineering; strong expertise in one or more cloud platforms: AWS, GCP, or Azure; proficiency in programming languages like Python, SQL, or Java/Scala; hands-on experience with big …
2025-12-01 10:25:13 | Link Group | Your responsibilities: you'll work with cutting-edge data platforms, design and maintain large-scale data pipelines, and collaborate with cross-functional teams to deliver high-quality data solutions …
2025-12-01 09:41:24 | Your responsibilities: designing, developing, and maintaining data solutions using Azure Databricks and Delta Lake; migrating traditional ETL solutions (e.g. Informatica) to a modern cloud architecture based on Microsoft Azure; …
2025-12-01 00:40:12 | Your responsibilities: conducting technical implementation in Celonis, including data transformation and dashboard application development; translating business needs into technical specifications and developing customer-specific process mining …
2025-11-30 13:40:47 | WHO YOU ARE: Required: Bachelor's degree in Computer Science, Data Engineering, Information Technology, or a related field; proven experience in data visualization tools, specifically Tableau, QV, and Power BI; strong analytical skills with a …
2025-11-29 10:25:12 | Bayer 5,0 | Which technologies & skills are important for us? Very good knowledge of data pipeline orchestration (designing scalable, cloud-native data pipelines for data transformation and aggregation based on business use cases); very good knowledge of GCP …
2025-11-29 10:25:12 | Commerzbank 4,4 | Wroclaw | Integration Developer / Data Engineer (part time)
2025-11-29 08:56:01 | ATOS
2025-11-28 17:10:01 | ACCENTURE (Polska) | Your responsibilities: manage and maintain large-scale Big Data platforms; optimize system performance and ensure data availability and reliability; guarantee the quality, integrity, and security of data; collaborate with development teams to …
2025-11-28 15:41:53 | Your responsibilities: work with one of the largest banks in the world on engaging projects that will transform the financial services industry; develop new and enhance existing financial and data solutions, having the opportunity …
2025-11-28 14:41:46 | At Scalo we deliver software projects and support our partners in growing their businesses. We build software that enables people to make changes, …
2025-11-28 12:35:07 | Scalo Sp. z o.o 5,0 | The client is one of the largest companies in the agribusiness and food production industry in the world. Basic information: Location: 1 day/month from Warsaw; Rate: PLN 110-150/hour net + VAT; Type of work: B2B contract; Duration: initial order 3 months
2025-11-28 11:53:52 | This offer is for you if: you have at least 3 years of experience as a Data Engineer; you know Python very well (including JupyterLab and PySpark); you work comfortably in an AWS environment; you have very good SQL skills; you know BI tools (Power…
2025-11-28 10:25:14 | Scalo | Ideal candidate: has 3+ years of commercial experience in implementing, developing, or maintaining data load systems (ETL/ELT); is proficient in Python, with a solid understanding of data processing challenges; has experience working with Apache …
2025-11-28 10:25:14 | KMD Poland | Ideal candidate: has 5+ years of commercial experience in implementing, developing, or maintaining data load systems (ETL/ELT); demonstrates strong programming skills in Python, with a deep understanding of data-related challenges; has hands-on …
2025-11-28 10:25:14 | KMD Poland | Wroclaw | Integration Developer / Data Engineer (part time)
2025-11-28 08:56:01 | ATOS | What you need to have to succeed in this role: strong knowledge of Java and proficiency in Python or Go; solid understanding of the software development lifecycle (SDLC); experience designing and building complex distributed systems; familiarity …
2025-11-26 10:25:16 | HSBC Technology Poland | Bachelor's or Master's degree in Computer Science, Engineering, or a related field; 3+ years of experience in data engineering or a similar role; strong proficiency in Python and familiarity with frameworks such as Spark or Scala; proven experience …
2025-11-26 10:25:16 | Harvey Nash Technology