Responsibilities:
• Architect and design distributed data processing solutions on Google Cloud Platform using BigQuery and Dataflow.
• Collaborate with product managers, data scientists, business users and other engineers to define requirements and design solutions.
• Influence the team by providing recommendations for continuous improvement.
• Mentor the development team on GCP technology and best practices.
Key Qualifications:
• 10+ years of hands-on experience with software development and enterprise data warehouse solutions.
• Hands-on experience developing distributed data processing platforms with Big Data technologies such as Hadoop, Google BigQuery, and Dataflow.
• Experience with commercial ETL platforms such as Informatica and SSIS.
• Experience with object-oriented analysis and design using Java, Python, or similar languages.
• Experience working with, processing, and managing large data sets (multi-TB scale).
• Experience with test-driven development and automated testing frameworks.
• Experience with Scrum/Agile development methodologies.
• Capable of delivering on multiple competing priorities with little supervision.
• Excellent verbal and written communication skills.
• Bachelor's Degree in computer science or equivalent experience.
Nice To Have:
• Experience with Google machine learning technologies such as the Prediction API.
• Experience with Google Compute Engine and App Engine.