Description:
The candidate will be part of the global Enterprise Monitoring team, which focuses on building the next generation of Plant Monitoring tools. The candidate will develop new system features and components of the real-time data pipeline, in which logs, events, metrics, and reference data are streamed and processed in real time. The features range from integration with external data sources (such as AWS) to real-time aggregation, alerting, transport to third-party consumers, and archiving. The role requires someone who is self-motivated, a quick learner, and comfortable working across numerous technologies, and who can take ownership of critical problems and work through the full project lifecycle, from problem analysis to timely delivery of the solution. The candidate should expect to work in a global virtual team, sometimes across multiple time zones. The ideal candidate is a self-motivated team player, committed to continuous delivery in an agile development environment and ready to learn new technologies and apply them to real business needs.
Skills required:
Java or Python programming experience, including streaming and messaging systems
Experience with high-throughput, low-latency platforms for handling real-time data feeds, such as Apache Kafka
Excellent problem-solving, design, development, and debugging skills
Excellent written and verbal communication skills
Ability to rapidly learn new things
Bachelor’s, master’s, or Ph.D. in computer science, human-computer interaction, design, statistics, or a related field
Skills Desired:
Experience with event-processing frameworks such as Spark, Kafka Streams, or Storm
Experience with AWS logging and monitoring tooling
Experience with data-processing systems such as Splunk, Hadoop, or Elasticsearch
Experience with JavaScript and Angular, and with visualization tools such as Kibana and Grafana