About us:
We’re visionary innovators who provide trading and workflow automation solutions, high-value analytics, and strategic consulting to corporations, financial institutions, central banks, and governments. More than 40% of the world’s largest companies use our solutions. We’ve achieved tremendous growth by bringing together some of the best and most successful financial technology companies in the world.
At ION, we offer careers that provide many opportunities: To invent. To design. To collaborate. To build. To transform businesses and empower people around the world to do more, faster and better than before. Imagine what you can do and experience. This is where you can do your best work.
Learn more at iongroup.com.
Your role:
Your duties and responsibilities
• Developing, building, testing and maintaining data pipeline architectures
• Identifying solutions to improve data reliability, efficiency and quality
• Preparing data for predictive and prescriptive modelling
• Ensuring compliance with data governance and security policies
• Applying big data principles and managing data warehouse platforms
• Creating programs and systems capable of acquiring, aggregating, transforming, and structuring the company's data
• Creating solutions that support the work of data analysts and data scientists, using data engineering tools, languages, and structures
• Integrating, consolidating, and cleaning data for application analysis
Other duties
We might ask you to perform other tasks and duties as your role expands.
Your skills, experience, and qualifications required
• Master's degree with honors in technical/scientific areas (computer science, mathematics, physics, computer engineering)
• Knowledge of and ability to use programming languages (e.g. Python, Java, JavaScript, and R)
• Applied skills in relational databases (e.g. SQL Server and Oracle)
• Expertise in programming and technical analysis
• Knowledge of distributed systems (e.g. Hadoop)
• At least five years of experience in big data
• Excellent knowledge of the English language
• Excellent interpersonal and communication skills
Considered a plus:
• Knowledge of machine learning frameworks (e.g. PyTorch, TensorFlow, or similar)
• ETL tools and APIs for creating and managing data integration processes
• Data processing (Apache Kafka)
• Distributed systems (Cloudera)
• BI tools (RStudio, PowerBI, Jupyter)
What we offer:
Permanent employment contract.
Location:
Assago (MI)
Important notes:
In accordance with Italian Law 68/99 (L.68/99), candidates on the disability list will be given priority.