Communications service providers will need to deploy real-time data analytics solutions if they are to optimise their businesses. This framework report provides an overview of key analytics issues and case studies that are relevant to the telecoms sector. It also provides guidance on the different use cases and the major vendors that are active within the market.
Analytics solutions need to scale to meet the demand for delivering results in real time while using large data sets and complex models
Today's analytics tools have been developed from the business intelligence tools of the past that were concerned with reporting what has already occurred. This may include the running of complex models to provide derived information that is used within KPIs or other business measurements.
Predictive analytics tools model future outcomes based on historical patterns. Highly skilled staff are able to create models based on an understanding of the data attributes and the potential outcomes.
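As a minimal sketch of the idea, the example below fits a least-squares trend line to a short series of historical monthly traffic figures and extrapolates the next period. The traffic numbers are illustrative assumptions, not data from this report, and a real predictive model would use far richer attributes.

```python
# Hedged sketch: fit a least-squares trend to historical values and
# extrapolate one period ahead. Input figures are hypothetical.

def fit_trend(values):
    """Return (slope, intercept) of a least-squares line through the
    points (0, values[0]), (1, values[1]), ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return slope, mean_y - slope * mean_x

def predict_next(values):
    """Forecast the value for the period immediately after the series."""
    slope, intercept = fit_trend(values)
    return slope * len(values) + intercept

monthly_traffic_gb = [120, 135, 149, 164, 178]  # hypothetical history
print(round(predict_next(monthly_traffic_gb), 1))
```

In practice the modelling step is where the highly skilled staff mentioned above add value: choosing which attributes drive the outcome matters far more than the fitting routine itself.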
In-line analytics tools overcome the time constraints of running models on stored data by being updated with real-time information. This enables models to react to live information and update live processes where needed. For example, it becomes possible to respond to events in real time, such as reconfiguring the network or selling services to active users at a location or on a website.
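The contrast with batch analytics can be sketched as follows: instead of querying stored data, the model keeps live state per cell site and fires an action the moment a condition is met. The event schema, site names and threshold are illustrative assumptions.

```python
# Hedged sketch of in-line analytics: maintain a live count of active
# users per cell site and trigger an action as each event arrives,
# rather than batch-querying stored data. All names are hypothetical.
from collections import Counter

THRESHOLD = 3  # hypothetical trigger level

def process_stream(events, on_trigger):
    """Update per-site counts event by event; fire the callback the
    moment a site's live population reaches the threshold."""
    active = Counter()
    for event in events:
        site, action = event["site"], event["action"]
        active[site] += 1 if action == "attach" else -1
        if action == "attach" and active[site] == THRESHOLD:
            on_trigger(site)

triggered = []
events = [
    {"site": "cell-42", "action": "attach"},
    {"site": "cell-42", "action": "attach"},
    {"site": "cell-7",  "action": "attach"},
    {"site": "cell-42", "action": "attach"},  # third attach -> trigger
    {"site": "cell-42", "action": "detach"},
]
process_stream(events, triggered.append)
print(triggered)
```

The key property is that the decision is taken while the user is still at the location; by the time a batch job had run, the selling opportunity would be gone.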
Definitions of components in an analytics framework
Segment or sub-segment: Data
Unstructured, semi-structured and structured data that is used within the analytics model. This data can be pulled from any data source and specifically measured using probes or diagnostics tools. Operational systems such as billing, customer relationship management (CRM) or enterprise resource planning (ERP) as well as network data such as IP detail records (IPDRs) or CDRs are often used, but transient data such as location are increasingly being tracked.
Segment or sub-segment: Extract, transform, load (ETL)
ETL processes are three functions often combined into a single tool.
• Extract: reads data from a specified source database and extracts a desired subset of data.
• Transform: manipulates the data using rules or lookup tables, or creates combinations with other data sources to convert it to the desired state.
• Load: writes the resulting data (either all of the subset or just the changes) to a target database, which may be a data warehouse or enterprise data warehouse, data marts, online analytical processing (OLAP) applications or "cubes", or other business intelligence or analytics application tools.
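The three steps above can be sketched as plain functions over in-memory records; a real tool would read from a source database and write to a warehouse. The field names, tariff lookup table and record values are all hypothetical.

```python
# Illustrative sketch of the three ETL steps. Field names and values
# are assumptions for the example, not from any real system.

def extract(source, wanted_fields):
    """Extract: read records and keep only the desired subset of fields."""
    return [{k: rec[k] for k in wanted_fields} for rec in source]

def transform(records, tariff_lookup):
    """Transform: enrich each record via a lookup table."""
    return [{**rec, "tariff": tariff_lookup[rec["plan"]]} for rec in records]

def load(records, target):
    """Load: write the resulting records to the target store."""
    target.extend(records)
    return target

source_db = [
    {"msisdn": "447700900001", "plan": "gold", "last_login": "2013-01-05"},
    {"msisdn": "447700900002", "plan": "silver", "last_login": "2013-01-06"},
]
warehouse = []
records = extract(source_db, ["msisdn", "plan"])
records = transform(records, {"gold": 40.0, "silver": 25.0})
load(records, warehouse)
print(warehouse[0]["tariff"])
```

Note that the extract step already drops the `last_login` field: pushing the subsetting as early as possible is what keeps data volumes on the network down.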
ETL functions are increasingly being replaced with ELT or ETLT tools, which load raw data before (or both before and after) transforming it, to reduce data loads on the network and provide faster execution. There is also considerable value in being able to store the large volumes of raw data.
Sample vendors and solutions:
Informatica, IBM InfoSphere DataStage
Also, but not dedicated to the function: Ab Initio, IBM Cognos, Microsoft SQL Server Integration Services (SSIS), SAP Business Objects, SAS Institute
Segment or sub-segment: Data infrastructure
Storage, servers and associated networking infrastructure. Historically, these have been the preserve of established vendors in the market, but the advent of unstructured data has created a new class of devices and data store. The open-source Apache Hadoop processing infrastructure has become popular. This builds on established massively parallel processing (MPP) techniques, in which multiple loosely coupled processors work on different parts of a programme.
Solutions such as those offered by Aster (Teradata), IBM Netezza, Oracle Exadata, SAP HANA and Vertica can be used in conjunction with Hadoop.
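The map/reduce pattern that Hadoop-style processing builds on can be sketched with Python's standard library: split the input into shards, let loosely coupled workers each process their own shard, then merge the partial results. Here `multiprocessing` stands in for a cluster, and the word-count task and sample strings are illustrative assumptions.

```python
# Hedged sketch of the map/reduce pattern behind MPP: independent
# workers each count words in their own shard (map), and the partial
# counts are combined at the end (reduce). multiprocessing stands in
# for a distributed cluster here.
from collections import Counter
from multiprocessing import Pool

def count_words(chunk):
    """Map step: each worker counts words in its own shard."""
    return Counter(chunk.split())

def merge(counters):
    """Reduce step: combine the partial counts into one result."""
    total = Counter()
    for c in counters:
        total += c
    return total

shards = ["call drop call", "drop handover call", "handover handover"]

if __name__ == "__main__":
    with Pool(3) as pool:
        partials = pool.map(count_words, shards)
    print(merge(partials)["call"])
```

Because the workers share nothing while mapping, throughput scales by adding more nodes; the trade-off is the shuffle/merge cost at the reduce stage.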
Sample vendors and solutions:
Apache Hadoop, Cloudera, Dell, EMC, Hortonworks, IBM, MapR, SAP HANA, Teradata