phoenixNap is seeking a Data Analyst to join the enterprise architecture team. This person will support data analysis and reporting using SQL, Business Intelligence, and data mining tools, with data derived from relational databases, data warehouses, and NoSQL/big data stores. As the Data Analyst, you will work closely with the data architects, data engineers, and identified customers to ensure the technical and business requirements of the data architecture are met. Success in this position will be measured by your ability to document, design, and implement data analytics solutions and processes that create competitive advantages for the company’s cloud and payment processing products.
Key Job Responsibilities:
Analyze, visualize, and provide analytics on significant amounts of data and build end-to-end reporting solutions to support company data initiatives.
Use data mining techniques to reduce customer attrition and find new revenue opportunities.
Build dynamic, rich dashboards using out-of-the-box features, customizations, and visualizations.
Support ETL jobs from relational databases to the data warehouse.
Identify actionable insights, make recommendations, and influence the direction of the business by effectively communicating findings to cross-functional groups.
Explore and recommend emerging technologies and techniques to support/enhance BI landscape components.
Develop and maintain standards and principles for the enterprise data warehouse.
Decompose business requirements, identify and challenge assumptions, design and implement technical solutions.
Assist in project and capacity planning and support technical operations when needed.
Must be able to support customer-facing requirements gathering when needed.
Perform large-scale data analysis and develop effective statistical models for segmentation, classification, optimization, time series, etc.
Key Job Requirements:
Statistics and data analysis
Data mining techniques and tools (e.g., Bayesian analysis, SPSS, SAS, R, JMP, RapidMiner, NLP, Python)
Familiarity with validating findings using an experimental and iterative approach, including clustering, regression, trend analysis, forecasting, modeling, and experiment design
2+ years of designing, implementing, and maintaining relational databases (MySQL, Oracle, SQL Server)
1+ years of experience working with ETL tools (Pentaho, Informatica, SSIS)
Knowledge of Unix/Linux-based operating systems and experience in shell scripting required
1+ years of experience working with data management solutions, including Data Marts, Data Warehouses, and Operational Data Stores
Experience with BI systems (Microsoft, MicroStrategy, Cognos)
Experience with NoSQL/BigData (Cassandra, MongoDB, ElasticSearch)
Experience with MPP data warehouse systems (Teradata, Netezza, Vertica)
Experience with Pentaho Data Integration and Toad Data Modeler
Experience with data mining and machine learning
Demonstrated ability to analyze and solve problems both independently and within cross-functional teams.
Self-motivated team player with a high degree of professionalism
Bachelor’s or master’s degree in computer science/engineering or information systems preferred but not required
What We Offer:
Work in a world-renowned international company
Highly talented, professional, and friendly team
The ability to use cutting-edge technologies
Constant support and coaching
Possibility for personal and professional growth
25 days of paid leave per year
5 days of fully paid sick leave per year
Private health insurance
Flexible working hours
Free soft drinks, fruit, tea, and coffee (when we work from the office)
Fully remote until July 2022; after that, 2 days of WFH per week
This is an exciting opportunity to work with a highly innovative and creative team, in a great working environment using the latest technologies, methodologies, and frameworks.