Future of X:
The Future of X (FoX) team is dedicated to transforming support and services at Juniper into a digital-first experience. We are investing in a state-of-the-art digital technology stack, leveraging AI and automation to simplify customer journeys, enhance support experiences, and accelerate processes, often by enabling self-service capabilities. Key components of our solution include omnichannel platforms, portals, and a suite of digital automation tools, providing customers with seamless, low-effort engagement options and enabling faster issue resolution, often without the need to open a case.
To build these complex solutions, our team brings together domain experts, architects, data scientists, data engineers, and MLOps professionals who manage the entire solution lifecycle, from concept to deployment. On the GenAI/AI front, we develop solutions tailored to Juniper’s business objectives.
Finally, every solution we bring to market is fully integrated into Juniper’s customer support and services technology stack, ensuring a seamless and impactful digital experience.
Responsibilities of the Data Scientist within the FoX team:
- Develop advanced ML/AI models leveraging both structured and unstructured datasets for batch and online inferencing use cases within the Customer Support and Employee Experience domains.
- Leverage or build analytics tools that use the data pipeline to provide meaningful insights into customer case data, bug data, operational data, and other key business performance metrics.
- Collaborate with partners across the executive, product, data, and operations teams to translate business priorities into ML/AI problems and develop solutions.
- Work with MLOps specialists to manage the full life cycle of model development from concept to production.
- Collaborate with data and analytics specialists to improve the functionality of our data systems.
- Identify trends and patterns in datasets to scope opportunities for automation.
Qualifications and Desired Experience:
- 7+ years of experience in an AI Engineer role, with a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- 6+ years of experience in end-to-end architecting of advanced ML and AI solutions.
- Strong hands-on coding skills (preferably in Python) for processing large-scale data sets and developing machine learning models leveraging both structured and unstructured data.
- Experience working in a technical support environment, handling datasets from CRM, bug systems, and logs.
- Experience supporting and collaborating with cross-functional teams in a multidimensional environment.
- Good team player with excellent interpersonal, written, verbal, and presentation skills.
- Experience creating and maintaining optimal data pipeline architectures, assembling large, complex data sets that meet functional and non-functional business requirements.
- Experience working with Large Language Models, Generative AI, and Conversational AI.
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, TensorFlow, and NLP libraries.
- Experience working with Databricks and Snowflake platforms.
- Experience with AWS, S3, Spark, Kafka, and Elasticsearch.
- Experience with AWS cloud services: EC2, EMR, RDS, and Redshift.
- Experience with stream-processing systems: Storm, Spark Streaming, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Familiarity with building and optimizing data pipelines, architectures, and data sets.
- Familiarity with MLOps practices and Agile development frameworks.
- Familiarity with CRM platforms such as Salesforce.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Experience identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.