Job Description
Data Warehouse/ETL Developer
- Minimum of 6 years of experience in data warehouse and ETL development
- Experience designing, implementing, and supporting highly scalable data warehouse solutions
- Business Requirements Analysis
- Communication with key business users
- Hands-on experience developing and supporting data warehouses and their integration with data lakes
- Experience designing data solutions in Big Data environments
- Understanding of Cloud Solutions and Architecture
- Experience using code/version management tools
- Basic knowledge of the manufacturing process domain
Knowledge:
- Deep knowledge of data warehouse and data lake concepts, Big Data, data architecture techniques, and overall data warehouse strategies
- Deep knowledge of data warehouse dimensional modeling and best practices
- Experience using MS TFS or Azure DevOps for code management
- Expertise in Snowflake:
  - Pipes/Snowpipe
  - SQS notifications
  - Stored procedures
  - Views
  - JavaScript
- Experience with AWS Redshift
- Expertise in SQL
- Experience with ETL/ELT development tools; Talend is desired
- Cloud solutions using AWS S3/Azure Blob Storage; experience with Snowflake DW and/or Redshift