ABOUT THE JOB
Bitwave Ltd. is a Serbian IT company based in Belgrade. Our remote and hybrid work models allow us to employ engineers throughout Serbia as well as in the surrounding region.
The first pillar of our business is our clients and partners: medium-sized and large international companies that need additional IT resources.
The second pillar is our software products, which we develop in partnership with regional companies.
We are focused on building strong, long-term relationships with all our partners, enabling their success, and creating a positive work environment for our employees.
ABOUT THE PROJECT
We are expanding the team working on a large-scale Data Lake platform used by one of the leading telecom operators. The role blends DevOps engineering, data pipeline operations, and platform automation across a modern Big Data environment.
You will work with Kubernetes-based distributed systems, high-volume data workloads, and advanced analytics tools that power mission-critical operations.
RESPONSIBILITIES
Platform & DevOps Engineering
- Deploy and manage Kubernetes clusters using Rancher
- Configure Kubernetes networking components (e.g., Calico)
- Automate deployments using Helm and GitLab CI/CD
- Provision infrastructure using Terraform
- Develop automation scripts, agents, or background daemons
- Build and support internal GUIs for control plane and monitoring
- Ensure secure, scalable, and highly available platform operations
- Maintain and optimize Ceph distributed storage
- Implement observability tooling (Prometheus, Grafana, ELK)
Data Engineering & Data Lake Operations
- Operate and support Big Data components: Spark, PySpark/Scala, Spark Streaming, Airflow, NiFi, Trino, Kafka
- Support Zeppelin and Jupyter notebook environments
- Maintain complex ingestion, transformation, and streaming pipelines
- Troubleshoot distributed processing and workflow orchestration
- Optimize compute and storage utilization across the platform
Security & Governance
- Integrate with Keycloak, RBAC, certificates, and secrets management
- Apply DevSecOps practices, cluster hardening, and secure deployments
- Validate the implementation of platform security mechanisms
Process & Collaboration
- Maintain documented workflows and processes using Confluence and Jira
- Work within Agile methodologies to ensure efficient planning, tracking, and delivery
REQUIREMENTS
Core Technical Skills
- Strong experience with: Kubernetes, Rancher, Helm, GitLab CI/CD, Terraform, GitOps workflows, Linux administration & shell scripting
- Experience with Big Data technologies: Spark / Airflow / NiFi / Kafka / Trino / Zeppelin
- Understanding of Data Lake architecture and data ingestion patterns
- Python or Scala for automation and data processing
- Experience with Ceph or similar distributed storage systems
Additional Skills (Nice to Have)
- Development of internal tools, background agents, or daemons
- Building lightweight GUIs for automation or monitoring
- Experience with Jupyter
- Knowledge of HashiCorp Vault or Consul
- Experience with streaming workloads (Spark Streaming, Kafka Streams)
Soft Skills
- Analytical mindset and strong problem-solving abilities
- Clear, structured, and proactive communication
- Ability to work in complex, distributed enterprise environments
WHAT WE OFFER
- Competitive compensation
- Long-term stability and hybrid working model
- Work with cutting-edge Data Lake technologies
- Supportive and positive work environment
- Team-building activities and professional development opportunities
HOW TO APPLY
If your qualifications match our open position, please apply via HR Lab.
Only short-listed candidates will be contacted.
We look forward to meeting you!