Content-based recommendation
 The requirements:
 some information about the available items such as the genre ("content")
 some sort of user profile describing what the user likes (the preferences)
• “Similarity” is computed from item attributes, e.g.,
• Similarity of movies by actors, director, genre
• Similarity of text by words, topics
• Similarity of music by genre, year
 The task:
 learn user preferences
 locate/recommend items that are "similar" to the user preferences
"show me more
of the same
what I've liked"
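As a rough sketch (the items, attributes, and scoring function below are illustrative, not taken from the slides), a content-based recommender ranks items by how well their attributes match the user profile:

```python
# Illustrative sketch: rank items by attribute overlap with a user profile.
# The catalogue, attributes and scoring function are hypothetical examples.

def overlap_score(profile_attrs: set, item_attrs: set) -> float:
    """Fraction of the item's attributes that also appear in the user profile."""
    if not item_attrs:
        return 0.0
    return len(profile_attrs & item_attrs) / len(item_attrs)

user_profile = {"thriller", "sci-fi", "christopher nolan"}          # learned preferences
catalogue = {
    "Movie A": {"thriller", "christopher nolan", "leonardo dicaprio"},
    "Movie B": {"romance", "comedy"},
}

ranked = sorted(catalogue, key=lambda m: overlap_score(user_profile, catalogue[m]), reverse=True)
print(ranked)  # Movie A first: it shares more attributes with the profile
```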
• Most content-based recommendation techniques have been applied to recommending text documents
• such as web pages or newsgroup messages
• The content of other kinds of items can also be represented as text documents
• i.e. textual descriptions of their basic characteristics
• Structured: each item is described by the same set of attributes

Title | Genre | Author | Type | Price | Keywords
The Night of the Gun | Memoir | David Carr | Paperback | 29.90 | Press and journalism, drug addiction, personal memoirs, New York
The Lace Reader | Fiction, Mystery | Brunonia Barry | Hardcover | 49.90 | American contemporary fiction, detective, historical
Into the Fire | Romance, Suspense | Suzanne Brockmann | Hardcover | 45.90 | American fiction, murder, neo-Nazism
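As a sketch, the structured representation above maps directly onto simple records; the field names below mirror the table's columns:

```python
# Sketch: the structured item descriptions from the table above as Python records.
from dataclasses import dataclass, field

@dataclass
class Book:
    title: str
    genre: list
    author: str
    type: str
    price: float
    keywords: set = field(default_factory=set)

items = [
    Book("The Night of the Gun", ["Memoir"], "David Carr", "Paperback", 29.90,
         {"press and journalism", "drug addiction", "personal memoirs", "new york"}),
    Book("The Lace Reader", ["Fiction", "Mystery"], "Brunonia Barry", "Hardcover", 49.90,
         {"american contemporary fiction", "detective", "historical"}),
    Book("Into the Fire", ["Romance", "Suspense"], "Suzanne Brockmann", "Hardcover", 45.90,
         {"american fiction", "murder", "neo-nazism"}),
]
```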
Content representation and item similarities
• Approach
• Compute the similarity of an unseen item with the user profile based on the keyword overlap (e.g. using the Dice coefficient)
• Or use and combine multiple metrics

 Item representation
Title | Genre | Author | Type | Price | Keywords
The Night of the Gun | Memoir | David Carr | Paperback | 29.90 | Press and journalism, drug addiction, personal memoirs, New York
The Lace Reader | Fiction, Mystery | Brunonia Barry | Hardcover | 49.90 | American contemporary fiction, detective, historical
Into the Fire | Romance, Suspense | Suzanne Brockmann | Hardcover | 45.90 | American fiction, murder, neo-Nazism
 User profile

Title | Genre | Author | Type | Price | Keywords
… | Fiction | Brunonia Barry, Ken Follett | Paperback | 25.65 | Detective, murder, New York
Dice coefficient: sim(b_i, b_j) = (2 × |keywords(b_i) ∩ keywords(b_j)|) / (|keywords(b_i)| + |keywords(b_j)|)
where keywords(b_j) describes book b_j with a set of keywords
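A small sketch of this keyword-overlap measure, using keyword sets taken from the tables above (the user-profile keywords are the three listed in the profile row):

```python
def dice(keywords_i: set, keywords_j: set) -> float:
    """Dice coefficient: 2 * |intersection| / (|A| + |B|)."""
    if not keywords_i and not keywords_j:
        return 0.0
    return 2 * len(keywords_i & keywords_j) / (len(keywords_i) + len(keywords_j))

profile = {"detective", "murder", "new york"}
lace_reader = {"american contemporary fiction", "detective", "historical"}
night_of_the_gun = {"press and journalism", "drug addiction", "personal memoirs", "new york"}

print(dice(profile, lace_reader))       # 2*1 / (3+3) ≈ 0.33
print(dice(profile, night_of_the_gun))  # 2*1 / (3+4) ≈ 0.29
```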
Term Frequency - Inverse Document Frequency (TF-IDF)
• Simple keyword representation has its problems, in particular when keywords are automatically extracted:
• not every word has similar importance
• longer documents have a higher chance of overlapping with the user profile
• Standard measure: TF-IDF
• encodes text documents as weighted term vectors in a multi-dimensional Euclidean space
• TF: measures how often a term appears (density in a document)
• assuming that important terms appear more often
• normalization is needed to take document length into account
• IDF: aims to reduce the weight of terms that appear in all documents
• Given a keyword i and a document j
• TF(i, j): term frequency of keyword i in document j
• IDF(i): inverse document frequency, calculated as IDF(i) = log(N / n(i))
• N: number of all recommendable documents
• n(i): number of documents among the N in which keyword i appears
• TF-IDF is calculated as: TF-IDF(i, j) = TF(i, j) × IDF(i)
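A minimal sketch of this weighting over three tiny documents; the documents reuse keyword strings from the tables above, and log base 10 is an assumption (the slide does not fix the base):

```python
import math

# Sketch of the TF-IDF weighting above; log base 10 is assumed.
docs = [
    "press and journalism drug addiction personal memoirs new york",
    "american contemporary fiction detective historical",
    "american fiction murder neo nazism",
]
N = len(docs)                              # number of recommendable documents
tokenized = [d.split() for d in docs]

def tf(term: str, doc_tokens: list) -> float:
    # term frequency, normalized by document length to account for longer documents
    return doc_tokens.count(term) / len(doc_tokens)

def idf(term: str) -> float:
    n_i = sum(1 for d in tokenized if term in d)   # documents containing the term
    return math.log10(N / n_i) if n_i else 0.0

def tf_idf(term: str, doc_tokens: list) -> float:
    return tf(term, doc_tokens) * idf(term)

print(tf_idf("american", tokenized[1]))    # appears in 2 of 3 documents -> lower weight
print(tf_idf("detective", tokenized[1]))   # appears in 1 of 3 documents -> higher weight
```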
Cosine similarity
• Usual similarity metric to compare vectors: cosine similarity (angle)
• Cosine similarity is calculated based on the angle between the vectors
• sim(a, b) = (a · b) / (|a| × |b|)
• Adjusted cosine similarity
• takes the average user ratings (r̄_u) into account by transforming the original ratings
• U: set of users who have rated both items a and b
• sim(a, b) = Σ_{u∈U} (r_{u,a} − r̄_u)(r_{u,b} − r̄_u) / ( √(Σ_{u∈U} (r_{u,a} − r̄_u)²) × √(Σ_{u∈U} (r_{u,b} − r̄_u)²) )
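A sketch of both measures; the small rating dictionary is a hypothetical example, not taken from the slides:

```python
import math

def cosine(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def adjusted_cosine(ratings: dict, item_a: str, item_b: str) -> float:
    """ratings: {user: {item: rating}}; only users who rated both items are used."""
    users = [u for u, r in ratings.items() if item_a in r and item_b in r]
    num = den_a = den_b = 0.0
    for u in users:
        mean = sum(ratings[u].values()) / len(ratings[u])      # average rating of user u
        da, db = ratings[u][item_a] - mean, ratings[u][item_b] - mean
        num, den_a, den_b = num + da * db, den_a + da * da, den_b + db * db
    return num / (math.sqrt(den_a) * math.sqrt(den_b)) if den_a and den_b else 0.0

print(cosine([1, 0, 2], [2, 1, 2]))                            # plain cosine of two term vectors
ratings = {"User1": {"a": 2, "b": 4}, "User2": {"a": 1, "b": 3}, "User3": {"a": 4, "b": 5}}
print(adjusted_cosine(ratings, "a", "b"))
```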
An example of computing the cosine similarity of annotations
To calculate the cosine similarity between two texts t1 and t2, they are first transformed into term vectors, as shown in the table on the original slide (not reproduced here).
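The transformation can be sketched as follows; the two texts t1 and t2 below are hypothetical, since the slide's table is not available:

```python
import math
from collections import Counter

# Hypothetical texts t1 and t2 (the slide's table of term counts is not available here).
t1 = "the lace reader is a detective story set in new york"
t2 = "a detective story about a murder in new york"

# Build term-count vectors over the combined vocabulary (what the slide's table shows).
vocab = sorted(set(t1.split()) | set(t2.split()))
c1, c2 = Counter(t1.split()), Counter(t2.split())
v1 = [c1[w] for w in vocab]
v2 = [c2[w] for w in vocab]

dot = sum(x * y for x, y in zip(v1, v2))
sim = dot / (math.sqrt(sum(x * x for x in v1)) * math.sqrt(sum(y * y for y in v2)))
print(round(sim, 3))
```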
Probabilistic methods
Calculation of probabilities in a simplistic approach

      | Item1 | Item2 | Item3 | Item4 | Item5
Alice |   1   |   3   |   3   |   2   |   ?
User1 |   2   |   4   |   2   |   2   |   4
User2 |   1   |   3   |   3   |   5   |   1
User3 |   4   |   5   |   2   |   3   |   3
User4 |   1   |   1   |   5   |   2   |   1

X = (Item1 = 1, Item2 = 3, Item3 = …)
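The slide does not spell out the computation; one common simplistic probabilistic approach (assumed here, not stated on the slide) is a naive-Bayes-style estimate of P(Item5 = v | X) from the other users' rows:

```python
# Hedged sketch of a naive-Bayes-style estimate (an assumption; the slide only shows the data):
# P(Item5 = v | X) ∝ P(Item5 = v) * Π_d P(Item_d = x_d | Item5 = v), estimated by counting rows.

ratings = {                      # rows from the table above (Item1..Item5)
    "User1": [2, 4, 2, 2, 4],
    "User2": [1, 3, 3, 5, 1],
    "User3": [4, 5, 2, 3, 3],
    "User4": [1, 1, 5, 2, 1],
}
alice_x = [1, 3, 3, 2]           # Alice's known ratings for Item1..Item4

def posterior(v: int) -> float:
    rows = [r for r in ratings.values() if r[4] == v]   # users who rated Item5 = v
    prior = len(rows) / len(ratings)
    likelihood = 1.0
    for d, x in enumerate(alice_x):
        likelihood *= (sum(1 for r in rows if r[d] == x) / len(rows)) if rows else 0.0
    return prior * likelihood    # no smoothing in this sketch

scores = {v: posterior(v) for v in {r[4] for r in ratings.values()}}
print(max(scores, key=scores.get), scores)   # most probable value for Alice's Item5 rating
```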
Slope One predictors

      | Item1 | Item5
Alice |   2   |   ?
User1 |   1   |   2

 The idea of Slope One predictors is simple and is based on a popularity differential between items for users
 Example: p(Alice, Item5) = 2 + (2 − 1) = 3
 Basic scheme: take the average of these differences of the co-ratings to make the prediction
 In general: find a function of the form f(x) = x + b
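A minimal sketch of the basic scheme, reproducing the two-row example above (the dictionary layout is just one possible representation of the rating matrix):

```python
def slope_one_predict(ratings: dict, target_user: str, target_item: str):
    """Basic Slope One: average the item-to-item rating differentials over co-rating users,
    then add them to the target user's own ratings."""
    preds, count = 0.0, 0
    for other_item, r_u in ratings[target_user].items():
        if other_item == target_item:
            continue
        # average differential (target_item - other_item) over users who rated both items
        diffs = [r[target_item] - r[other_item]
                 for u, r in ratings.items()
                 if u != target_user and target_item in r and other_item in r]
        if diffs:
            preds += r_u + sum(diffs) / len(diffs)
            count += 1
    return preds / count if count else None

ratings = {"Alice": {"Item1": 2}, "User1": {"Item1": 1, "Item5": 2}}
print(slope_one_predict(ratings, "Alice", "Item5"))   # 2 + (2 - 1) = 3.0
```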
Linear classifiers
• Most learning methods aim to find the coefficients of a linear model
• A simplified classifier with only two dimensions can be represented by a line separating the relevant from the nonrelevant documents
 The line has the form w1·x1 + w2·x2 = b
– x1 and x2 correspond to the vector representation of a document (using e.g. TF-IDF weights)
– w1, w2 and b are parameters to be learned
– A document is classified by checking whether w1·x1 + w2·x2 > b (see the sketch below)
 In n-dimensional space the classification function is w^T·x = b
 Other linear classifiers:
– Naive Bayes classifier, Rocchio method, Widrow-Hoff algorithm, support vector machines
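A sketch of the two-dimensional decision rule; the weight values below are made up for illustration rather than learned from data:

```python
# Sketch of the decision rule w1*x1 + w2*x2 > b; w1, w2 and b are hypothetical fixed
# values chosen for illustration, not parameters learned from training data.
w1, w2, b = 0.8, 0.5, 1.0

def classify(x1: float, x2: float) -> str:
    """x1, x2: e.g. TF-IDF weights of two terms in a document."""
    return "relevant" if w1 * x1 + w2 * x2 > b else "nonrelevant"

print(classify(1.2, 0.9))   # 0.96 + 0.45 = 1.41 > 1.0  -> relevant
print(classify(0.3, 0.4))   # 0.24 + 0.20 = 0.44 <= 1.0 -> nonrelevant
```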
Metrics: measuring the error rate
– Mean Absolute Error (MAE) computes the average deviation between predicted ratings and actual ratings
– Root Mean Square Error (RMSE) is similar to MAE, but places more emphasis on larger deviations
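A minimal sketch of both metrics over a few hypothetical predicted/actual rating pairs, which also shows how RMSE penalizes the single large error more heavily than MAE:

```python
import math

def mae(predicted: list, actual: list) -> float:
    return sum(abs(p - a) for p, a in zip(predicted, actual)) / len(actual)

def rmse(predicted: list, actual: list) -> float:
    return math.sqrt(sum((p - a) ** 2 for p, a in zip(predicted, actual)) / len(actual))

predicted = [3.0, 4.5, 2.0, 5.0]   # hypothetical predicted ratings
actual    = [3.0, 4.0, 4.0, 5.0]   # hypothetical actual ratings
print(mae(predicted, actual))      # (0 + 0.5 + 2 + 0) / 4 = 0.625
print(rmse(predicted, actual))     # sqrt((0 + 0.25 + 4 + 0) / 4) ≈ 1.031
```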
Next …
• Hybrid recommendation systems
• More theories
• Boolean and Vector Space Retrieval Models
• Clustering
• Data mining
• And so on