Job Purpose
Combined Ratio Solutions is an American company with offices worldwide that develops data products for insurance companies. Our customers are large international enterprises and leaders in the insurance industry. We are looking for a Data Engineer to help modernize an existing Enterprise Data Warehouse on Azure and establish the end-to-end foundations for processing financial data: target architecture, environments (Dev/UAT/Prod), delivery processes, and the gold layer plus dimensional data marts for reporting.
This is an excellent opportunity for someone who wants to work and learn alongside a highly motivated team. If you are results-driven and enjoy working in a team environment, we'd like to meet you.
Key Duties & Responsibilities:
- Build and maintain ELT pipelines using Azure Data Factory, Synapse, Spark, or Microsoft Fabric.
- Develop and support dimensional data marts and gold-layer datasets for reporting.
- Implement incremental loading, backfills, and data validation checks.
- Troubleshoot pipeline and data issues, and support reliability improvements.
- Collaborate with analysts and Power BI teams to deliver reporting-ready datasets and views.
- Participate in monitoring, logging, and alerting for data workflows.
- Optimize SQL queries and transformations with guidance from senior team members.
- Contribute to documentation, development standards, and CI/CD processes.
- Follow data security and access control standards when working with sensitive information.
Skills & Qualifications:
- 2–4 years of experience in Data Engineering or a related role.
- Strong SQL (T-SQL preferred): window functions, incremental patterns, deduplication, performance basics.
- Solid understanding of data warehousing and dimensional modeling concepts.
- Hands-on experience with Azure data services such as Azure Data Factory, ADLS Gen2, Azure SQL, Synapse, Spark, or Microsoft Fabric.
- Experience building and maintaining ETL/ELT pipelines.
- Python for scripting, data transformation, or pipeline support.
- Experience using Git in collaborative development.
- Strong analytical thinking, attention to detail, and willingness to learn.
- Good communication skills and ability to work in a cross-functional team.
- English B2+.
Optional Requirements:
- Experience with Microsoft Fabric.
- Familiarity with Azure DevOps and CI/CD workflows.
- Basic knowledge of Terraform or Bicep.
- Experience with financial data or reconciliation processes.
- Exposure to Cosmos DB or other NoSQL technologies.
What We Offer:
- Work with our distributed high-performance team.
- Interesting, large-scale, and technically challenging projects.
- Modern technologies and opportunities for continuous professional growth.
- Competitive salary.
- Free choice of work format.
- Medical health insurance.
- Refer-a-friend bonus.
Additional Note:
- Working hours: 11:00 AM to 8:00 PM CET.
- Working format: full-time, remote.
If you like wild growth and working with happy, enthusiastic over-achievers, you'll enjoy your career with us!