General Summary of Job Responsibilities
The Data Engineer partners with analytic teams and business partners across the enterprise to gather data requirements and to design and build data pipelines and architecture in line with defined IT best practices. Responsibilities include creating a data warehouse, designing the related extraction and loading functions, testing designs, data modeling, and ensuring that applications run smoothly. The role also involves building analytic tools, assembling and cleaning big data, designing and building ETL processes in various languages, creating data visualizations, and identifying and improving internal processes to automate manual data wrangling, optimize performance, and improve data quality. The Data Engineer seeks guidance from Senior/Principal Data Engineers and Data Architects as needed.
Essential Duties and Responsibilities
- Capture and evaluate requirements from the data architect or business partner, consider development alternatives, and establish timelines
- Assemble and clean big data sets from various data sources to meet functional business requirements for small to large enhancements
- Work with various stakeholders and IT to design and build efficient data pipelines for small to large enhancements (see the pipeline sketch after this list)
- Create data visualizations for customer insight, operational efficiency, and other key business performance metrics using various reporting tools
- Identify opportunities to improve internal processes by automating manual data wrangling, optimizing performance, and improving data quality
- Prepare artifacts to support solutions as well as to document activities as part of a project
- Provide technical guidance for small to large enhancements in areas of solution alternatives, design, testing and documentation
- Provide incident management and direct technical consulting and support for current applications/solutions
- Other duties as assigned or as necessary
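For illustration, here is a minimal sketch of the extract-clean-load pattern behind the pipeline duties above. The source file, column names, and staging table are hypothetical, and pandas with sqlite3 stands in for the team's actual tooling:

```python
# Minimal extract-clean-load sketch (illustrative only; the source file,
# column names, and target table below are hypothetical).
import sqlite3

import pandas as pd


def run_pipeline(source_csv: str, target_db: str) -> None:
    # Extract: pull raw records from a source extract.
    raw = pd.read_csv(source_csv)

    # Clean: drop duplicates, normalize the business key, and remove rows
    # missing required fields -- the kind of manual wrangling to automate.
    cleaned = (
        raw.drop_duplicates()
           .assign(customer_id=lambda df: df["customer_id"].str.strip().str.upper())
           .dropna(subset=["customer_id", "usage_kwh"])
    )

    # Load: land the cleaned data in a warehouse staging table.
    with sqlite3.connect(target_db) as conn:
        cleaned.to_sql("stg_customer_usage", conn, if_exists="replace", index=False)


if __name__ == "__main__":
    run_pipeline("customer_usage.csv", "warehouse.db")
```

In practice the same extract, clean, and load steps would target the platforms named under Knowledge/Skills/Abilities below.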
Knowledge/Skills/Abilities
- Must have excellent verbal and written communication skills and be able to work with all levels of the organization
- Proficient in establishing and maintaining good working relationships (business and IT teams)
- Knowledge of project planning/full lifecycle delivery using an Agile framework, preferably using Azure DevOps (ADO)
- Understanding of data test methodologies and testing tools
- Ability to work effectively with contract employees and vendors
- Understanding of database management principles and methodologies, including data structures, data modeling, data warehousing, and transaction processing
- Knowledge of data design principles (dimension/fact), methods, and approaches, applying systems engineering concepts such as structured data design, supportability, survivability, reliability, scalability, and maintainability
- Knowledge of change and release tools and processes utilized to implement solutions across multiple teams and technologies
- Experience creating conceptual, logical, and physical data models
- Knowledge of and experience with DBT, Microsoft SQL Server, Databricks, and the Microsoft Azure cloud platform are particularly important for this role
- Ability to design Kimball-style dimension and fact tables from business requirements; new designs must contribute to the overall team data model where applicable (see the star-schema sketch after this list)
- Proficiency in Python, SQL, and Kimball design concepts must be demonstrated
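For illustration, here is a minimal sketch of the Kimball-style dimension and fact design referenced above, assuming hypothetical table and column names (sqlite3 stands in for a platform such as SQL Server or Databricks):

```python
# Minimal Kimball star-schema sketch: one dimension and one fact table.
# Illustrative only -- the table and column names are hypothetical.
import sqlite3

DDL = """
CREATE TABLE dim_customer (
    customer_key   INTEGER PRIMARY KEY,   -- surrogate key
    customer_id    TEXT NOT NULL,         -- natural/business key
    customer_name  TEXT,
    region         TEXT,
    effective_date TEXT,                  -- supports slowly changing dimensions
    expiry_date    TEXT
);

CREATE TABLE fact_energy_usage (
    usage_date_key INTEGER NOT NULL,      -- grain: one row per customer per day
    customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
    usage_kwh      REAL NOT NULL,         -- additive measure
    billed_amount  REAL NOT NULL          -- additive measure
);
"""

with sqlite3.connect("warehouse.db") as conn:
    conn.executescript(DDL)
```

The design choice to show is the split between descriptive attributes (dimension) and additive measures at a declared grain (fact), which is the core of the Kimball approach.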
Education/Experience
- Bachelor's degree in computer science, engineering, data sciences, or a related field with two (2) or more years of software engineering experience
- [OR] Associate's degree in computer science, engineering, data sciences, or a related field with four (4) or more years of software engineering experience
- [OR] High School Diploma/GED with six (6) or more years of software engineering experience
- All options require beginner-level experience with analytic tool builds, data architecture/design, user requirements definition, building big data pipelines, ETL tool extraction, basic data testing aptitude, and an understanding of analytic tool deployment processes and best practices
- Ability to create dimension- and fact-based data models from business requirements. Demonstrated data modeling/entity-relationship diagramming capability is required without exception. The ability to create performant SQL- and Python-based extract, load, and transform (ELT) processes is also required, including a demonstrated understanding of complex, multi-factor join logic (see the ELT sketch after this list)
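For illustration, here is a minimal sketch of a Python-driven ELT step with multi-factor join logic, building on the hypothetical staging and star-schema tables sketched earlier (it assumes the staging table also carries usage_date and billed_amount columns):

```python
# Sketch of an ELT transform with multi-factor join logic: staged usage
# rows are matched to the customer dimension on the business key AND the
# dimension row's effective-date range, then loaded into the fact table.
# Illustrative only; the tables come from the earlier hypothetical sketches.
import sqlite3

TRANSFORM_SQL = """
INSERT INTO fact_energy_usage (usage_date_key, customer_key, usage_kwh, billed_amount)
SELECT CAST(strftime('%Y%m%d', s.usage_date) AS INTEGER),
       d.customer_key,
       s.usage_kwh,
       s.billed_amount
FROM stg_customer_usage AS s
JOIN dim_customer AS d
  ON  d.customer_id = s.customer_id        -- factor 1: business key
  AND s.usage_date >= d.effective_date     -- factor 2: dimension row was
  AND s.usage_date <  d.expiry_date;       -- in effect on the usage date
"""

with sqlite3.connect("warehouse.db") as conn:
    conn.execute(TRANSFORM_SQL)
```

The date-range predicates are what make the join multi-factor: equality on the key alone would double-count customers with more than one historical dimension row.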
Preferred Experience
- Experience maintaining an existing legacy code base
- Experience with Python, SQL, ADF, DBT, and Databricks
Consumers Energy
1 Energy Plz
Jackson, Michigan, United States
www.consumersenergy.com