Description
We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.
As a Software Engineer III at JPMorgan Chase within Enterprise Technology's Core Data Platform group, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.
Job responsibilities
- Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on data lifecycle concepts and 3+ years applied experience
- Experience across the data lifecycle
- Experience with SQL and understanding of NoSQL databases and their niche in the marketplace
- Formal training or certification on software engineering concepts and 3+ years applied experience
Preferred qualifications, capabilities, and skills
- Hands-on experience with Apache Kafka and/or Spark, and experience using Databricks for big data analytics and processing
- Experience with a data orchestration tool such as Airflow
- Familiarity with data governance and metadata management
- Hands-on experience building applications using AWS services such as Lambda, EMR, and EKS, as well as REST APIs
- Solid hands-on background in Java and the Spring Framework; advanced SQL skills (e.g., joins and aggregations)
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages