Big data science faces challenges related to the volume, velocity, and variety of data. Specifically:

- Volume: petabytes of structured and unstructured data (event logs, program data, content metadata, and purchase histories) from 10 million subscribers across 10 touchpoints are too large for traditional data warehouses.
- Velocity: data arrives at 140 MB/s (about 12 TB/day), faster than the current pipeline can absorb; roughly 95% of it is dropped today. A quick sanity check of these figures follows this list.
- Variety: inconsistent structures, fragmented XML, and shifting fields make it difficult to extract features and apply machine learning algorithms at scale; see the parsing sketch after this list.
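As a back-of-envelope check on the velocity figures (assuming decimal units, 1 TB = 1,000,000 MB), the sustained ingest rate does work out to roughly 12 TB per day:

```python
# Back-of-envelope check: sustained ingest rate -> daily volume (decimal units).
mb_per_second = 140
seconds_per_day = 86_400
tb_per_day = mb_per_second * seconds_per_day / 1_000_000  # 1 TB = 1,000,000 MB
print(f"{tb_per_day:.1f} TB/day")  # -> 12.1 TB/day, consistent with the figure above
```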
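To make the variety problem concrete, here is a minimal sketch of defensive feature extraction over XML event fragments. The record shape and field names (`subscriberId`, `eventType`, and so on) are hypothetical illustrations, not the actual schema; the point is to tolerate malformed fragments and shifting field names rather than assume a fixed structure.

```python
import xml.etree.ElementTree as ET
from typing import Optional

def extract_features(fragment: str) -> Optional[dict]:
    """Parse one XML event fragment defensively; return None if it is unrecoverable."""
    try:
        root = ET.fromstring(fragment)
    except ET.ParseError:
        # Fragmented or truncated XML: skip the record instead of failing the batch.
        return None

    def first_text(*tags: str) -> Optional[str]:
        # Field names shift across schema versions, so probe known aliases in order.
        for tag in tags:
            node = root.find(tag)
            if node is not None and node.text:
                return node.text.strip()
        return None

    return {
        "subscriber_id": first_text("subscriberId", "subscriber_id", "subId"),
        "event_type": first_text("eventType", "event_type"),
        "timestamp": first_text("timestamp", "ts", "eventTime"),
    }

# Example: one well-formed record and one truncated one.
good = "<event><subscriberId>42</subscriberId><eventType>play</eventType></event>"
bad = "<event><subscriberId>42</subscriber"
print(extract_features(good))  # {'subscriber_id': '42', 'event_type': 'play', 'timestamp': None}
print(extract_features(bad))   # None
```

Skipping unparseable records keeps the batch moving; in practice one would also count and quarantine them for later inspection rather than silently discard them.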