These are just some of the technologies that you might work with at Godel Data Division:
- Data Integration: Azure Data Factory (with SSIS)
- Data Processing: Azure Databricks
- Database & Data Warehouse: Azure SQL Database, Azure Synapse, SQL Server
- Storage: Azure Blob, Azure Data Lake Storage
You will be responsible for:
- Creating functional design specifications, Azure reference architectures, and assisting with other project deliverables as needed.
- Designing and developing Platform as a Service (PaaS) solutions using various Azure services.
- Creating data factories; orchestrating data processing activities in data-driven workflows; monitoring and managing the data factory; and moving, transforming, and analyzing data.
- Designing complex enterprise data solutions that utilize Azure Data Factory, including migration plans to move legacy SSIS packages into Azure Data Factory.
- Building conceptual and logical data models.
- Designing, building, and scaling data pipelines across a variety of source systems and streams (internal, third-party, and cloud-based), distributed/elastic environments, and downstream applications and/or self-service solutions.
- Developing and documenting mechanisms for deployment, monitoring, and maintenance.
- Identifying performance bottlenecks and accessing external data sources.
- Implementing security requirements.
- Monitoring and managing deployed solutions.
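The data-movement work described above (orchestrating full and incremental loads in a data-driven workflow) typically reduces to a watermark pattern: remember the highest change timestamp processed so far and copy only newer rows on each run. A minimal sketch of that pattern, with sqlite3 standing in for the Azure source and sink (the `orders` table and its columns are invented for illustration):

```python
import sqlite3

def incremental_load(src: sqlite3.Connection,
                     dst: sqlite3.Connection,
                     watermark: int) -> int:
    """Copy rows changed since `watermark` from src to dst; return the new watermark."""
    # Pull only rows modified after the last successful run.
    rows = src.execute(
        "SELECT id, name, modified_at FROM orders WHERE modified_at > ?",
        (watermark,),
    ).fetchall()
    # Upsert into the sink so re-runs are idempotent.
    dst.executemany(
        "INSERT OR REPLACE INTO orders (id, name, modified_at) VALUES (?, ?, ?)",
        rows,
    )
    dst.commit()
    # Advance the watermark to the newest change seen (or keep it if nothing changed).
    return max((r[2] for r in rows), default=watermark)
```

Running it with `watermark=0` performs the initial full load; persisting the returned watermark between runs makes every later run incremental.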
Ideally you have:
- Extensive knowledge of data architecture principles (e.g., data lake, data warehousing, etc.).
- Strong knowledge of relational databases, as well as skills in SQL query design and development.
- Experience working with data in Python.
- Conceptual understanding of cloud architectures, service tiers and hybrid deployment models.
- Ability to independently troubleshoot and performance-tune large-scale enterprise systems.
- Expertise with ETL/ELT design patterns and data mart structures (star and snowflake schemas, etc.).
- Experience with Microsoft Cloud Data Platform: Azure Data Lake, Azure Blob Storage, Azure Data Factory, Azure Cosmos DB, Azure Databricks, Synapse Analytics.
- Experience with Azure Data Factory (ADF): creating pipelines and activities for full and incremental data loads into Azure Data Lake Store and Azure SQL Data Warehouse.
- Hands-on experience with data migration methodologies and processes for moving on-premises data to the cloud using tools such as Azure Data Factory, Data Migration Service, and SSIS.
- Ability to communicate clearly and work well in a team.
- Intermediate level of English.
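For the data mart item above, the star schema can be illustrated in a few lines: a central fact table holding measures, keyed to dimension tables holding descriptive attributes. A minimal sketch (all table and column names are invented; sqlite3 stands in for the warehouse):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
-- Dimension tables: descriptive attributes, one row per entity.
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, iso_date TEXT);
-- Fact table: measures plus a foreign key to each dimension (the "star").
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
INSERT INTO dim_customer VALUES (1, 'Acme');
INSERT INTO dim_date     VALUES (20240101, '2024-01-01');
INSERT INTO fact_sales   VALUES (1, 20240101, 99.5);
""")
# Typical mart query: join the fact to a dimension and aggregate a measure.
total = con.execute("""
    SELECT c.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.name
""").fetchone()
```

A snowflake schema is the same idea with the dimension tables themselves normalized into further sub-dimensions.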
Nice to have:
- Hands-on programming experience with additional technologies such as Python, Java, Scala, C#, Spark, Databricks, or PowerShell.
- Experience with developer tools such as Azure DevOps and GitLab.
- Experience with the Power BI reporting tool.
- Understanding of big data tools such as Hadoop.
- Experience managing Microsoft Azure environments with VMs, VNets, subnets, NSGs, and resource groups.
- An Azure certification is desirable.
What we offer:
- EMPLOYMENT CONTRACT OR B2B
- AGILE DELIVERY
- CHALLENGING PROJECTS
- PROFESSIONAL TEAM
- TRUST-FOCUSED CULTURE
- CORPORATE ACTIVITIES
- FREE ENGLISH CLASSES
- INVESTMENT INTO YOUR TRAINING
- FLEXIBLE WORKING SCHEDULE AND HYBRID WAY OF WORK
- FINANCIAL STABILITY