Integration Specialist - São Paulo, Brazil - DATASIDE
Description
Responsibilities:
- Conceptualize solutions for business problems by applying data warehousing and programming techniques
- Build end-to-end products utilizing Azure DevOps
- Full responsibility for the quality of the code the team delivers.
- Participate in the development and evangelization of the Python coding standards within the organization.
- Full responsibility for delivering solutions into production (working through operations teams).
- Responsible for training and mentoring developers on the team.
- Work with Technical Project Management to create and maintain the prioritized backlog and schedule for the team.
- Responsible for architectural decisions with consultation from other members of engineering leadership.
- Document every aspect of the project in a standard way for future reference
- Storyboard and present insights to senior leadership
- Lead the team of Data Engineers and be able to guide them
Mandatory Requirements:
- 8-12 years in programming (Python) and data engineering and design.
- Proficient in RESTful APIs, with strong Python programming experience.
- Experience with managing multiple Python virtual environments (pip, conda).
- Architect and implement medium- to large-scale solutions (end-to-end) on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Databricks).
- Experience in developing data engineering solutions in Azure for Enterprise Data Management utilizing Spark, process orchestration, logging and auditing frameworks.
- Experience in development and implementation of ETL solutions using PySpark, Azure Data Factory and Azure Databricks.
- Experience working with large amounts of real data with SQL (HP Vertica, Synapse, Teradata, Oracle, or MySQL) and Python.
- Experience with Airflow
- Capable of building insightful visualizations in Power BI and Python (good to have).
- Should have knowledge and experience in working with APIs (REST, SOAP).
- Capable of building custom connectors to retrieve the required information from databases via JDBC/ODBC and from other systems through open-source or custom APIs as required.
- Experience with Azure cloud platform.
- Should have knowledge and experience in continuous integration and delivery (CI/CD) of Azure Data Factory pipelines using Azure DevOps and GitHub.
- Should have knowledge and experience with the SDLC and have worked in an Agile environment.
- Propose architectures with Azure cost/spend in mind and develop recommendations to right-size data infrastructure.
- Should possess tech-lead capabilities: manage other data engineers, guide the team, and be the customer-facing point of contact for queries
IT skills required:
- Languages: Python; good to have knowledge of Java.
- Azure (Azure Data Factory, Azure Databricks, Azure Functions, Logic Apps, Azure Data Lake, Azure Event Hubs, Delta Lakehouse, Azure SQL), Python, PySpark, Spark, Hive, Azure DevOps, Bash/Shell, PowerShell, data warehouse, NoSQL (Cosmos DB)
- Visualization tool: Power BI
- GitHub, Azure DevOps, Git, pip/conda repositories
- Enterprise security patterns (Kerberos, SAML, OAUTH).
- Airflow
- Fluent English