Description
- Aid the team in delivering continual data feeds to users and monitoring the health of data quality and overall data ingest
- Visualize, interpret, and report data findings in dynamic reports
- Employ a variety of data manipulation and visualization tools to effectively convey current status and historical trends to leadership, users, and data team
- Collaborate with platform, software, and other data engineers to (re)configure data ingestion pipelines to be more reliable
- Comfortable working with data in a variety of formats including Excel, CSV, JSON, and XML
- Support the incident management process to ensure that incidents are documented and resolved quickly. Perform root cause analysis to understand and prevent repeated data outages
- Develop and maintain software to automate monitoring of real-time feeds and alert on timeliness, volume, lineage, and distribution issues. Capture lessons learned and historical data from data pipelines, translating them into actionable steps to improve data ingest.
- Support the design, development, testing and implementation of web-based collaboration tools & platforms for data reporting.
- Demonstrate proficiency with frequently used scripting languages such as Python (primary) or Bash, commonly used in data science and data analytics applications.
Qualifications
- Bachelor of Science required, preferably in Computer Science, Mathematics, Electrical Engineering, Physics, Information Systems, or Information Technology
- Minimum five (5) years of data analysis/engineering experience
Tools
- Apps/Platforms: Docker/Podman, Kubernetes, Helm, NiFi, Kafka, Grafana, Prometheus, InfluxDB, PowerBI
- Operating Systems: Windows, Linux (RedHat)