Provide development and automation of computing processes on premises and in the cloud using cloud-based architecture to detect, predict, and respond to opportunities in business operations. Work with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Strive to achieve true business integration by applying data integration best practices, merging and securing data in ways that reduce maintenance costs and increase the use of enterprise-wide data as an asset, and develop business intelligence.
Grade: 14
Qualifications may warrant placement in a different job level.
Deadline to apply: Open until filled
Tasks and Responsibilities
- Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
- Provide daily production support for the Enterprise Data Warehouse, including jobs in Alteryx, Hadoop, and Oracle PL/SQL, and remain flexible in managing high-severity incident/problem resolution.
- Develop data integration and ETL/ELT workflows in the cloud using cloud-based architecture (Azure).
- Participate in troubleshooting and resolving data integration issues such as data quality.
- Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
- Provide work estimates and communicate the status of assignments.
- Assist in QA efforts by providing input for test cases and supporting test case execution.
- Analyze transaction errors, troubleshoot software issues, develop bug fixes, and participate in performance tuning efforts.
- Develop Alteryx workflows and complex Oracle PL/SQL programs for the Data Warehouse.
- Select and use DevOps tools for continuous integration, builds, and monitoring of solutions.
- May provide input to the area budget.
- Make some independent decisions and recommendations that affect the section, department, and/or division.
- Perform other duties as assigned.
Minimum Skills, Knowledge, and Abilities
- Experience in a data integration role.
- Experience using Apache Spark, NiFi, and/or Kafka.
- Experience using Python.
- Experience integrating enterprise software using ETL modules.
- Knowledge of data architecture, structures and principles with the ability to critique data and system designs.
- Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications, using the Informatica and/or MuleSoft platform.
- Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
- Ability to integrate data from web services in XML, JSON, flat file, and SOAP formats.
- Knowledge of the core concepts of the RESTful API Modeling Language (RAML 1.0) and of designing MuleSoft solutions.
Preferred Qualifications
- Relevant Certifications
- Experience in API Management
- Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
- Knowledge of Test Driven Development (TDD)
- Familiarity with cloud-based architecture
- Experience with data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow)
- Experience in a technology organization
CPS Energy
145 Navarro
San Antonio
Texas, United States
www.cpsenergy.com