
Real-Time Field Service Optimization


It's estimated that by 2024, global spending on telecom services will exceed $1.5 trillion. As critical enablers of growth and innovation across many industries, telecommunication service providers are adopting a cloud-first approach to accelerate time to market, integrate with leading technologies like 5G, and transform their infrastructure and business models to deliver innovative products and customer service. A key part of this is servicing millions of customers around the world in a timely and efficient way. To provide the best customer service while transitioning to real-time backend operations, telco organizations need the most accurate, up-to-the-minute customer and field service data so that jobs can be completed smoothly and as planned.

→ Webinar: Real-Time Data Streaming for Telecom

Today, telco companies rely heavily on third-party service providers to handle the installation, configuration, and maintenance of home optical network terminal (ONT) routers for their customers. This field service approach presents certain challenges. Many customers are unaware of this practice, mistakenly believing that the work is carried out by the telco itself. This scenario raises several issues that impact the business when the service does not meet customer expectations:

1. Customer misperception: Customers may attribute any technical issues or poor performance to the telco, even though the root cause might lie with the third-party service provider itself.

2. Accountability and responsibility: Third-party companies handling these installations and repairs must assume the responsibility of doing the job correctly and efficiently. Their actions directly impact the customer experience and the telco's reputation. A mistake can lead to customer dissatisfaction, increased support calls, and churn, all of which translate directly into lost revenue.

3. Quality control: Telcos must implement quality control measures to ensure that third-party service providers are aligned with their standards and guidelines. These measures include training, periodic audits, and performance evaluations to maintain service quality and customer satisfaction.

4. Impact on brand image: Any service provider that represents the telco company in the eyes of the customer influences the overall brand image of the telco. Poorly executed installations can damage the telco's reputation.

5. Risk mitigation: In response to these concerns, telcos should implement risk mitigation strategies. These might include comprehensive contracts with service providers, performance-based incentives, and regular customer feedback channels to identify and address issues promptly.

A largely manual process of sharing data with third-party service providers

1. Manual workflow: The use of a manual script to collect and manage orders from various files is inefficient, time-consuming, and unmaintainable. This manual process is slow as well as error-prone, as it relies on human operators to handle tasks such as order assignment.

2. Non-real-time processing: The existing system—based on file exchanges using SFTP, homemade Linux scripts, and Python APIs—lacks real-time capabilities. This means that orders are not processed as they arrive, leading to delays in task assignments and degraded customer service. Today, real-time processing is essential to efficiency and responsiveness.

3. Task assignment complexity: Assigning tasks via CRM to different third-party companies based on historical data and subjective scores or perceptions introduces bias into the process. It can lead to incorrect assignments, as it doesn't account for real-time factors such as the current workload or location of each company's workforce.

4. Error-prone practices: Relying on manual processes for task assignments, especially in complex scenarios involving multiple third-party companies, increases the likelihood of errors. These errors can result in customer dissatisfaction, operational inefficiencies, and potentially financial losses.

5. Lack of scalability: The present architecture may struggle to scale with an increasing volume of orders, since the script is neither event-driven nor containerized. As more orders arrive and are processed, the manual nature of the workflow becomes increasingly slow, leading to bottlenecks and decreased operational agility.
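As a rough illustration of the fragility described above, a legacy workflow along these lines might parse order files dropped over SFTP and assign every order based on static, subjective provider scores. The sketch below is hypothetical (the provider names, scores, and CSV layout are invented), but it shows why such a process ignores real-time workload and location:

```python
# Hypothetical sketch of the legacy batch workflow: orders arrive as CSV
# files, and assignment relies on static, subjective provider scores.
import csv
import io

# Static "perception" scores per third-party provider -- the subjective
# input criticized above. No real-time workload or location data is used.
PROVIDER_SCORES = {"provider_a": 0.9, "provider_b": 0.7}

def assign_orders(order_csv: str) -> dict:
    """Assign every order in a CSV dump to the highest-scored provider."""
    best = max(PROVIDER_SCORES, key=PROVIDER_SCORES.get)
    assignments = {}
    for row in csv.DictReader(io.StringIO(order_csv)):
        # Every order goes to the same provider, regardless of current load.
        assignments[row["order_id"]] = best
    return assignments

orders = "order_id,customer\n1001,alice\n1002,bob\n"
print(assign_orders(orders))
```

Because the scores never change between batch runs, every order lands on the same provider until a human edits the script, which is exactly the bottleneck the event-driven redesign removes.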

Data streaming enables real-time, end-to-end communications for optimizing field service

A telco company in Spain has implemented a future-proof solution with Confluent Cloud. Its source CRM system was modified to emit real-time events (generated by microservices in an event-driven architecture), feeding them into a dynamic "Topic-as-a-Service" architecture and an API gateway for synchronous and asynchronous communications. These events are automatically routed to the third-party companies, providing immediate updates on the status of service tasks.

→ The Ultimate Guide to Event-Driven Microservices Architecture

Streaming with an event-driven microservice architecture enables: 

1. Enhanced communication: Confluent Cloud eliminates communication gaps between the legacy system (previously based on file transfer and Linux scripting, since updated to microservices) and the third-party companies. Real-time event routing ensures that all stakeholders are informed immediately, facilitating better collaboration between the telco and third-party companies.

2. Process optimization: By implementing an event-driven data architecture in which downstream microservices consume new messages as they’re generated, the organization has eliminated manual task assignment and significantly improved the overall workflow for their teams. This results in increased efficiency, faster response times, and streamlined operations.

3. Scalability and flexibility: The system is now well-prepared to add new third-party collaborative companies in a flexible way, with zero disruption to existing workloads. It can easily adapt to evolving business needs and partnerships.
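As a minimal, in-memory sketch of the pattern these three points describe: events are pushed to subscribers the moment they are produced, rather than waiting for a batch script. A real deployment would use Kafka topics and consumer groups in Confluent Cloud; the bus, topic, and handler names below are all hypothetical:

```python
# Toy stand-in for topic-based event routing: handlers receive each event
# as soon as it is produced, instead of polling files on a schedule.
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class EventBus:
    subscribers: dict = field(default_factory=lambda: defaultdict(list))

    def subscribe(self, topic: str, handler) -> None:
        # Adding a new third-party consumer is just another subscription,
        # with zero disruption to existing subscribers.
        self.subscribers[topic].append(handler)

    def produce(self, topic: str, event: dict) -> None:
        for handler in self.subscribers[topic]:
            handler(event)  # delivered immediately, no polling

bus = EventBus()
received = []
# A third-party field-service microservice subscribes to new-order events.
bus.subscribe("service-orders", received.append)
# The CRM produces an event; the partner sees it right away.
bus.produce("service-orders", {"order_id": "1001", "status": "created"})
print(received)
```

The key property is the inversion of control: producers never need to know which partners exist, so onboarding a new collaborator touches only the subscription side.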

Key components:

1. Confluent Cloud: The backbone of this new solution is Confluent Cloud, a cloud-native, complete data streaming platform that is available everywhere the business needs it to be—spanning on-prem to cloud and across clouds. 

2. HTTP sink connectors: These connectors play an important role in notifying third-party companies. Using HTTP sink connectors standardizes communication, as most interactions go through documented APIs. This approach simplifies the exchange of information.

3. Stream processing: Data can be enriched in real time using stream processing tools such as Flink. Stream processing patterns enable filtering, projections, joins, aggregations, materialized views, and other data enrichment to provide contextualized data for downstream consumers.

4. Stream Governance: Schema Registry ensures data quality and compatibility across producers and consumers, while allowing them to be upgraded to new data formats, preserving existing contract agreements until all communicating services have been upgraded. Stream Governance provides visibility into data lineage, enhances collaboration across teams, and supports more efficient microservice development.

5. Terraform: Using Terraform for resource provisioning in the cloud is both efficient and cost-effective. It ensures that resources are created as needed, eliminating unnecessary expenses and optimizing cloud infrastructure.
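To make the connector piece more concrete, an HTTP sink connector is configured with a small set of properties. The sketch below assembles such a configuration as a plain Python dict; the property names follow the self-managed Confluent HTTP Sink connector and may differ for the fully managed version, and the topic name and partner URL are placeholders:

```python
# Hedged sketch: an HTTP sink connector configuration as a Python dict.
# Check the connector's documentation for the authoritative property names
# for your deployment; the endpoint and topic below are invented.
def http_sink_config(topic: str, partner_url: str) -> dict:
    return {
        "connector.class": "io.confluent.connect.http.HttpSinkConnector",
        "topics": topic,               # topic carrying field-service events
        "http.api.url": partner_url,   # partner's documented REST endpoint
        "request.method": "POST",      # push each event as an HTTP POST
        "tasks.max": "1",
    }

cfg = http_sink_config("service-orders", "https://partner.example.com/orders")
print(cfg["http.api.url"])
```

In practice a configuration like this would be submitted through the Kafka Connect REST API or provisioned declaratively with Terraform, which is where the resource-provisioning point above comes in.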

In this new event-driven architecture, the telco has microservices that produce new tasks (i.e., new customer service orders). Written to Apache Kafka® topics in Confluent Cloud, these can be processed in real time while adhering to a consistent schema. The messages are then streamed through HTTP sink connectors, which communicate the latest information to downstream third-party microservices responsible for end-to-end field service management. These include scheduling, dispatching workers based on location and availability, updating the status of customer requests, and more.
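The "consistent schema" point is worth illustrating: when new fields are added with defaults, consumers written against the older schema keep working while services are upgraded independently, which is the compatibility guarantee Schema Registry enforces. The sketch below mimics that contract in plain Python, without a registry; the field names are hypothetical:

```python
# Sketch of backward-compatible schema evolution: a v2 producer adds a
# field with a default, so a consumer written for v1 keeps working.
SCHEMA_DEFAULTS = {"priority": "normal"}  # new optional field in v2

def read_order(event: dict) -> dict:
    """Consumer logic written against v1; tolerates v2 events."""
    merged = {**SCHEMA_DEFAULTS, **event}
    return {"order_id": merged["order_id"], "priority": merged["priority"]}

v1_event = {"order_id": "1001"}                      # old producer
v2_event = {"order_id": "1002", "priority": "high"}  # upgraded producer
print(read_order(v1_event), read_order(v2_event))
```

With a real registry, this is the behavior you get by registering the v2 schema under a backward-compatible setting: old events deserialize with the declared default, and no consumer has to redeploy in lockstep with producers.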

Business outcomes

Since the implementation of Confluent Cloud, the telco organization has experienced numerous beneficial outcomes, including:

1. Incident-free operations: Leveraging Confluent Cloud as the streaming backbone in their data architecture, their new system has operated smoothly without any incidents since its production deployment. This marks a significant improvement in reliability and stability.

2. Improved collaboration: Real-time event updates have led to better collaboration between the organization and third-party companies. Everyone is on the same page regarding task status, reducing confusion and enhancing productivity.

3. Enhanced efficiency: This solution has not only mitigated the communication issues between the organization and its third-party collaborators but has also improved operational efficiency. Real-time event routing ensures that all stakeholders remain consistently informed, leading to improved collaboration and faster response times. This has had a positive impact on customer satisfaction, improving the company's Net Promoter Score (NPS).

4. Scalability and adaptability: The cloud-native features of Confluent Cloud help ensure cost-effectiveness—the telco organization only pays for necessary resources and can seamlessly scale as it adds new third-party collaborators. The system is now flexible and capable of quickly adding new microservices. As business needs evolve, the organization is well-prepared to expand its partnerships.

5. Greater revenue: The updated system contributed to revenue growth by retaining existing customers and attracting new ones thanks to the higher NPS. Additionally, the reduction in customer churn has had a positive impact on the organization’s finances.

This telco organization has successfully eliminated a range of technical and operational challenges through a flexible and future-proof solution. By implementing a real-time event-driven approach, supported by Confluent Cloud as the central platform and HTTP sink connectors for standardized communication, the organization has achieved the above outcomes.

The organization's commitment to embracing modern event-driven architectures and leveraging cutting-edge technologies has not only addressed past pain points but has also positioned it for future success. Its gains in efficiency, transparency, and flexibility have translated into enhanced collaboration, customer satisfaction, and the ability to adapt to evolving business demands. This approach has transformed the telco organization's operations, ensuring that it remains agile and responsive in a dynamic business landscape.

Ramón Marquez is a Senior Solutions Engineer at Confluent, where he empowers organizations to harness the potential of real-time data streams and event-driven architectures. He has worked with leading companies around the world that prioritize data as a primary asset, including Talend, IBM, and Devo.
