Careers

Join the team

At Next Pathway, we have built a work environment based on three core principles:

  • Emphasize quality first, each and every time
  • Put people in roles where they will succeed and feel challenged
  • Build a team of well-qualified individuals who can share ideas and learn from each other

Our environment rewards people for hard work, loyalty, innovation and mutual support. We aim to match people’s strengths, skills and talents to our requirements. Identifying this ideal match between attitude, skill and need leads to success.

Here you will find a list of the current opportunities at Next Pathway. We have taken care to identify the specific requirements we need at this time. Please review them and feel free to submit your resume for our consideration.

We do not accept applications from employment or staffing agencies.

90-Second Mission Statement


Open Positions

Montreal, QC

Responsibilities

  • Participate in transformational consulting to assist clients in assessing their current-state ecosystem, infrastructure, services, and data landscape
  • Collaborate and work closely with the customer architecture group and the Mphasis development team
  • Engineer cloud-native solutions that can be instantiated with different technologies by applying scientific (or pseudo-scientific) principles in a pragmatic way
  • Advise on the change management strategies and programs that will enable the services transformation
  • Perform service gap analysis and create data mappings and a service catalog
  • Define an integration and API strategy for the enterprise with optimal, cost-effective integration patterns

Required Skills

  • Experience executing large-scale application transformation or application migration/modernization projects
  • In-depth, hands-on experience in either Java or .NET is a must; functional reactive programming experience is preferred
  • Experience creating opti-channel services and an understanding of channel-based integrations
  • Strong experience with integration technologies – API gateways, integration buses, microservices
  • Strong experience designing and building RESTful API integrations on-premises and in the cloud
  • Deep and broad knowledge of multiple core technologies such as messaging, routing, data processing, event streaming, security, and system management
  • Engineering expertise in building highly performant distributed systems for scalability, security and resiliency
  • Demonstrated competency in leveraging current technologies in the areas of containers, container orchestration and peer-to-peer computing
  • Experience in architecture design of applications based on microservices and distributed computing; very strong experience with messaging-based and RESTful applications
  • Set-up and configuration of application monitoring and event sourcing tools
  • Experience in architecture design of cloud-native applications and an understanding of the implications of twelve-factor apps

Toronto, ON

We are searching for a Big Data Platform Engineer to join our team.

The ideal Candidate will have:

  • Computer Science background
  • Minimum of 5 years of proven experience in a core competency
  • Knowledge of CI tools such as Git, Maven, SBT, Jenkins, Artifactory/Nexus, JIRA, and Confluence
  • Proven ability to write efficient, maintainable, well-documented code
  • Adoption of Agile and Scrum development methodology
  • Effective in a team setting as well as in individual capacity
  • Ability to develop and maintain unit tests, integration tests, and test automation
  • DevOps experience is a plus
  • Hadoop administration certification in Hortonworks or Cloudera
  • Experience with Hadoop administration, monitoring, and software deployment tools (e.g., Ambari)
  • Experience with resource management utilizing YARN
  • Knowledge of full Hadoop stack and architecture
  • Performance tuning and troubleshooting
  • Data backup and replication
  • Data compaction
  • Security architectures, Kerberos, AD, Ranger
  • Unix system administration experience
  • Unix shell scripting
  • Experience with open source software installation and configuration
  • Experience administering any of the following is a plus: MongoDB, Kafka, Elasticsearch, Cassandra
  • Excellent communication skills

Montreal, QC

We are searching for a Senior Program Manager to join our team.

The Candidate must be able to run multiple engagements in an empowered manner, making decisions that drive not only operational excellence but also revenue growth.

The ideal Candidate will have:

  • 5 years of Senior Technical Delivery Management experience
  • Development background (5+ years)
  • Experience running API/Digital Transformation projects
  • Experience managing platform modernization teams using API tools
  • Proven experience in delivering digital projects and products
  • Demonstrated experience in sprint planning and agile development best practices
  • Experience running multiple projects across different geographies
  • Experience managing on and offshore teams
  • Extensive experience crafting user stories
  • Proven experience balancing multiple priorities
  • Excellent organization skills
  • Experience in matrix-managing multi-disciplinary teams
  • Agile environment experience
  • Excellent communication skills

Toronto, ON

We are searching for a Senior Big Data Developer to join our team.

The ideal Candidate will have:

  • Computer Science background
  • Minimum of 5 years of proven experience in a core competency
  • Knowledge of CI tools such as Git, Maven, SBT, Jenkins, Artifactory/Nexus, JIRA, and Confluence
  • Proven ability to write efficient, maintainable, well-documented code
  • Adoption of Agile and Scrum development methodology
  • Effective in a team setting as well as in individual capacity
  • Ability to develop and maintain unit tests, integration tests, and test automation
  • DevOps experience is a plus
  • Data Lake engineering experience
  • Experience working with Hadoop, Spark, Spark SQL, Spark Streaming, Scala, Java, Avro, JSON, Parquet
  • Experience in software engineering and framework development
  • Experience working with distributed platforms (Linux)
  • High performance and scalable solutions engineering
  • Highly available application architecture design
  • Application performance tuning and troubleshooting
  • Design and develop key micro-services as components of Data Fabric
  • Experience with any of the following is a plus: MongoDB, Dremio, Ignite, Solace, Kafka, Airflow, Docker, Kubernetes, ELK stack, Cassandra, HBase, ETL technologies, Hive, Postgres
  • Unix and shell scripting
  • Excellent communication skills

Toronto, ON

We are searching for a Full Stack Developer to join our team.

The ideal Candidate will have:

  • Computer Science background
  • Minimum of 5 years of proven experience in a core competency
  • Knowledge of CI tools such as Git, Maven, SBT, Jenkins, Artifactory/Nexus, JIRA, and Confluence
  • Proven ability to write efficient, maintainable, well-documented code
  • Adoption of Agile and Scrum development methodology
  • Effective in a team setting as well as in individual capacity
  • Ability to develop and maintain unit tests, integration tests, and test automation
  • DevOps experience is a plus
  • Solid experience in UI/UX design and development
  • Expert in Python, with knowledge of at least one Python web framework such as Django or Flask
  • Familiarity with some ORM (Object Relational Mapper) libraries
  • Understanding of the threading limitations of Python, and multi-process architecture
  • Familiarity with event-driven programming in Python
  • Experience in Micro-services and REST API development
  • Good understanding of server-side templating languages such as Jinja 2, Mako, etc.
  • Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3
  • Understanding of accessibility and security compliance 
  • Knowledge of user authentication and authorization between multiple systems, servers, and environments
  • Understanding of fundamental design principles behind a scalable application
  • Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform
  • Able to create database schemas that represent and support business processes
  • Strong unit test and debugging skills
  • Experience with Postgres and MongoDB is a plus
  • Excellent communication skills

Toronto, ON

We are looking for a Senior Java Developer to join our core development team and guide a group of highly skilled developers with their engineering knowledge and skills.

The ideal Candidate will have:

  • 10+ years of strong Core Java Development experience
  • 5+ years of Java Spring
  • 10+ years of Data Warehousing
  • 5+ years of Big Data Technology
  • Knowledge of compilers – good to have
  • AWS or cloud experience – good to have
  • PhD or Masters in Computer Engineering

Montreal, QC

  • Must have 15+ years of experience in design and development, with 2+ years in API Design, Development, and Management.
  • Must have several years of experience in Integration/Distributed/Mobile/Big Data/IoT application development.
  • Must possess good experience with API platforms and gateways, from at least one of the following: Apigee Platform (Edge, Micro), Mulesoft Anypoint, Azure API Management.
  • Must possess experience with IoT Gateway, Sense, and Istio
  • Must have experience with API gateway on-premise/ private cloud implementation
  • Must have experience designing APIs using OpenAPI (formerly Swagger)
  • Must possess good working knowledge of API platform features including but not limited to security policies, monitoring, metering, monetization and developer portal.
  • Must be familiar with MBaaS and mobile gateway.
  • Must possess hands-on experience with Kafka and, optionally, MQ
  • Must possess good experience in designing and developing distributed applications based on microservices architecture.
  • Must be well-versed in and have expert-level experience with RESTful API design, API lifecycle management, microservice architecture patterns, Event-Driven Architecture patterns, and data integration patterns
  • Must possess the ability to suggest and implement architectural, scalability, performance, and process improvement ideas
  • Must have a good understanding of Continuous Integration (CI) and Continuous Deployment (CD) processes.
  • Must be able to liaise with multiple stakeholders to help develop cloud adoption roadmaps
  • Exposure to any cloud solution architecture, such as GCP, AWS, or Microsoft Azure, will be an added benefit
  • Exposure to Kubernetes and Docker is a must
  • Self-starter, experienced in leading junior resources
  • Ability to collaborate and communicate effectively with all stakeholders, such as developers, business analysts, and business stakeholders.
  • Typically has worked with the following technologies, but not limited to: Java, J2EE, Spring Boot, Spring Cloud, microservices architecture, Spring MVC, Spring Security, OAuth 2.0, continuous integration with Jenkins, Maven, JUnit, code quality analysis tools such as Sonar, and any scripting language
  • Experience working with multi-vendor, multi-culture, distributed offshore and onshore development teams in dynamic and complex environments
  • Must have excellent written and verbal communication skills

Montreal, QC

Qualifications

  • 6+ years of total IT experience including 2+ years of Big Data experience
  • Experience in Spark Streaming, Kafka, Spark SQL, HBase, and Java is a must
  • Experience building real-time data streaming pipelines from Kafka (or any message broker) using Spark Streaming
  • Hands-on functional programming experience in Scala, Python, or Java 8.
  • Proficient in Linux/Unix scripting.
  • Experience in Agile methodology is a must.
  • Knowledge of standard methodologies, concepts, best practices, and procedures within Amazon EMR Big Data environment
  • Self-starter and able to independently implement the solution.
  • Good problem-solving techniques and communication

Job Description

  • Hands on Big Data developer role (Spark streaming)
  • Actively participate in scrum calls, story points, estimates and own the development piece.
  • Analyze the user stories, understand the requirements and develop the code as per the design
  • Develop test cases, perform unit testing and integration testing
  • Support QA testing, UAT, and production deployment
  • Develop batch and real-time data load jobs from a broad variety of data sources into Hadoop, and design ETL jobs to read data from Hadoop and pass it to a variety of consumers/destinations.
  • Perform analysis of vast data stores and uncover insights.
  • Analyze long-running queries and jobs, and performance-tune them using query optimization techniques and Spark code optimization.

Montreal, QC

  • Determine and understand the data integration requirements and NFRs
  • Define the integration strategy and architecture
  • Identify, define, and analyze the integration solution architecture
  • Architect and design the integration for performance, scalability, and availability based on industry-standard patterns
  • Solution, design, and develop the data integration approach, patterns, methods, payload models, processes, and components
  • Perform benefit analysis of solution options and technologies
  • Socialize the integration solution architecture with the stakeholders
  • Model and design building blocks for the integration solution
  • Guide developers, perform reviews of work products, and provide oversight
  • Define and review standards, architecture design principles, and processes
  • Be accountable for the technical integrity of the project architecture/design
  • Provide hands-on technical direction to other developers to review code and resolve issues
  • Take part in reviewing, designing, and troubleshooting project technical architecture, designs, common components, and code
  • Skills: integration architecture, implementing messaging layers and message brokers, designing and developing REST APIs, web services/microservices, event publish-and-subscribe patterns, XML/JSON handling
  • Tools, Technology & Framework: Java, Node.js, Python, REST/SOAP, JMS, AMQP, API gateways, IIB, etc.

Montreal, QC

  • Experience with streaming analytics technologies for job monitoring and configuration, such as Azure Stream Analytics, Google Cloud Dataflow and Cloud Pub/Sub, Apache Storm, Apache Spark, Striim, IBM InfoSphere Streams, Software AG Apama, TIBCO StreamBase, SAS Event Stream Processing
  • Experience developing streaming monitoring jobs using PowerShell, Visual Studio, or the .NET SDK
  • Ability to define appropriate metrics to ensure quality of service to Analytics teams
  • Develop and Manage the Infrastructure and Tools to enable Development, Verification, Validation and Delivery of Products into Dev, Test, Staging and Production Environments.
  • Architect an Environment that supports Continuous Delivery
  • Setting up Monitoring and Alerting Tools
  • Experience with Continuous Delivery and Continuous Deployment Pipelines with strict SLAs
  • Strong Development and Deployment experience of 3+ years
  • Experience with message queue technologies (e.g., Kafka, MQ Series, Cloud Pub/Sub)
  • Experience using REST APIs
  • Experience with Hadoop and Cloud technology stacks and infrastructure (Hortonworks/Cloudera, AWS, Azure, GCP)
  • Experience with Cloud storage technologies and data ingestion, Event Hubs, IoT hubs, Blob storage
  • Experience with SQL Query Development
  • Excellent troubleshooting and analysis skills
  • Familiarity with Machine Learning, AI toolsets, Power BI, Data Science concepts
  • Experience with developing, configuring and monitoring CI/CD pipelines
  • Exceptional Communication Skills (Verbal and Written)
  • Ability to work independently and under pressure in a fast-paced environment

Montreal, QC

Minimum 12 years of experience in the design and development of Integration/Distributed/Mobile/Big Data/IoT applications, with at least 2 years as an Architect creating solutions based on Kafka

  • Must have strong Programming experience in Core Java, Reactive service design, service registry
  • Must have strong experience in writing programs using Kafka APIs and Kafka Streams API
  • Must have a deep understanding of and experience with streaming technologies such as Kafka, Spark, and Flink
  • Must have experience in handling huge volumes of streaming messages from Kafka or any other robust message broker
  • Must have hands-on experience in standing up and administering an on-premises Kafka platform, which includes creating backups and mirroring of Kafka cluster brokers, broker sizing, topic sizing, h/w sizing, performance monitoring, broker security, topic security, and consumer/producer access management (ACL)
  • Must have experience managing Kafka clusters in both Windows and Linux environments for:
    1. Guaranteed delivery – exactly once
    2. Guaranteed Persistence
    3. Low latency
  • Must have a good understanding and experience of working in cloud/on-premise platforms
  • Must have exposure to Kubernetes and Docker
  • Must have exposure to developing applications using microservices architecture
  • Must have experience working with multi-vendor, multi-culture, distributed offshore and onshore development teams in dynamic and complex environments
  • Must have excellent written and verbal communication skills
  • Experience with Kafka Connect will be desirable

To join our team, please apply here!

Please submit your resume for consideration.
Please note that we will only contact those candidates meeting our requirements.