JPMorgan Chase Senior Infrastructure Developer - Apache NiFi & Software in Lewisville, Texas

Global Technology Infrastructure (GTI) is the technology infrastructure organization for the firm, delivering a wide range of products and services, and partnering with all lines of business to provide high-quality service delivery, exceptional project execution, and financially disciplined approaches and processes in the most cost-effective manner. The objective of GTI is to balance business alignment with the centralized delivery of core products and services. GTI is designed to address the unique infrastructure needs of specific lines of business and the demand to leverage economies of scale across the firm.

The Core Foundation Services (CFS) team is responsible for providing end-to-end support for critical technologies that are used across the company. This includes Configuration and Orchestration, Identity Management, Name Services, Enterprise Monitoring Solutions, and the automation tools used to manage these technologies. The BIFrost platform team within Core Services is seeking an infrastructure developer who will help implement a centralized, firm-wide data-mining solution for unstructured and semi-structured data.

The candidate will develop Hadoop and messaging business solutions leveraging microservices and APIs across on-premises and off-premises environments. The candidate will also have platform onboarding responsibilities and will be required to integrate unified data services, frameworks, and user-defined functions already in the Hadoop ecosystem.

The candidate is responsible for providing solutions and development using Big Data technologies such as Kafka, Apache NiFi, and Spark with Java/Scala. The candidate should be able to manage the design, development, and monitoring of Apache NiFi pipelines and streaming solutions to improve their performance. We are looking for someone who shares these values and is excited to help push our team to the next level with advanced skills in building data flows.


  • The Senior Software Developer will work as part of an Agile scrum team and will be responsible for analyzing, planning, and designing end-to-end Apache NiFi workflows, including development, testing, and performance optimization.

  • Participate in NiFi development, architecture, and design discussions with the technical team, and interface with other teams to create efficient and consistent solutions.

  • Develop scalable and robust data streaming and processing solutions using Apache NiFi, Kafka, and the Big Data platform.

  • Develop and manage the existing 30 Apache NiFi workflows.

  • Demonstrate an excellent understanding of Agile, SDLC, and CI/CD processes and automated tools, spanning requirements/issue management, defect tracking, source control, build automation, test automation, and release management.

  • Work within the global Agile team to ensure high-quality, timely delivery and implementation of projects and the overall success of the team.

  • Collaborate and partner with high-performing, diverse teams and individuals throughout the firm, developing meaningful relationships to accomplish common goals.


  • 8+ years of experience in developing complex Java and microservices/API solutions

  • Excellent communication skills and the ability to work as a team player across multiple development tracks

  • 8+ years of experience in core Java, with an excellent grasp of networking, threading, I/O, and core Java APIs, as well as OOP concepts and implementations

  • Experience in building Big Data applications using Scala 2.x and the Akka framework is a plus

  • 4+ years of solid experience working with distributed architectures such as Lambda and Kappa

  • Solid distributed-systems fundamentals: the CAP theorem, filesystems and compression, BASE/ACID data stores, resource managers (e.g., YARN), computational frameworks (streaming/batch/interactive/real-time), coordination services, schedulers, data integration frameworks (messaging/workflow/metadata/serialization), data analysis tools, and operational frameworks (monitoring, benchmarking, etc.)

  • 4+ years of experience working with Big Data technologies such as Hadoop, Spark, Kafka, Hive, HBase, Sqoop, and other NoSQL solutions

  • Experience in developing data pipelines, metadata management, and data transformation using Spark, Kafka, Hadoop, and NiFi

  • Good project experience working with HDFS, Hive, Spark, YARN, and MapReduce

  • Good project experience with full-text search technologies such as Elasticsearch, and with building reporting and analytics platforms

  • Good experience working in Linux environments and an onsite/offshore model, including performance engineering and tuning for the technologies listed above

  • Good understanding of security frameworks and protocols such as Kerberos, SSL/TLS, and SASL

  • Ability to quickly learn and work with new, cutting-edge technologies

JPMorgan Chase is an equal opportunity and affirmative action employer. Disability/Veteran.