
Final Project Proposal


Seun Elemo


Mutual information

Mutual information (MI) measures the amount of information shared between two variables: it tells us how much knowing one reduces our uncertainty about the other.

For example:

  • If you have two variables, like a message's content and its sentiment, MI can show how much the content helps predict the sentiment.
  • If MI is high, it means the two variables are closely related (knowing one gives you a lot of information about the other).
  • If MI is low or zero, it means they’re mostly independent (knowing one doesn’t tell you much about the other).
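The idea above can be sketched in a few lines of Python. This is a minimal, made-up example (the message data and the binary "word present" feature are hypothetical): it estimates MI between a word's presence and a sentiment label from joint and marginal counts.

```python
# Minimal MI sketch with made-up data: does the presence of a word
# in a message tell us anything about the message's sentiment?
from collections import Counter
from math import log2

# (word_present, sentiment) pairs for eight hypothetical messages
pairs = [(1, "pos"), (1, "pos"), (1, "pos"), (1, "neg"),
         (0, "neg"), (0, "neg"), (0, "neg"), (0, "pos")]

n = len(pairs)
joint = Counter(pairs)             # counts for p(x, y)
px = Counter(x for x, _ in pairs)  # counts for p(x)
py = Counter(y for _, y in pairs)  # counts for p(y)

# MI = sum over (x, y) of p(x, y) * log2( p(x, y) / (p(x) * p(y)) )
mi = sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
         for (x, y), c in joint.items())
print(round(mi, 3))
```

Because the word's presence mostly (but not perfectly) tracks the sentiment here, the MI comes out positive but well below the 1-bit maximum for two balanced binary variables.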


https://www.youtube.com/watch?v=eJIp_mgVLwE


A joint probability is the probability that two events happen at the same time.

A marginal probability is the probability that one event happens on its own.
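These two quantities can be illustrated directly from a small table of observations. The (weather, mood) data below is invented purely for illustration:

```python
# Joint vs. marginal probabilities from a small made-up dataset.
from collections import Counter

obs = [("sunny", "happy"), ("sunny", "happy"), ("sunny", "sad"),
       ("rainy", "sad"), ("rainy", "sad"), ("rainy", "happy")]

n = len(obs)
# Joint: both events together, e.g. P(sunny AND happy)
p_sunny_and_happy = Counter(obs)[("sunny", "happy")] / n
# Marginal: one event on its own, e.g. P(sunny)
p_sunny = sum(1 for weather, _ in obs if weather == "sunny") / n

print(p_sunny_and_happy, p_sunny)
```

These are exactly the probabilities that go into the MI formula: MI compares the joint probability against the product of the marginals.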


Keyword Trend Analyzer

  • Description: Identify which words in a message thread are most related to specific topics (like "work" or "friends").
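One possible starting point for this tool, sketched with invented messages and topic labels (a full version would score words by MI against the topic rather than raw counts):

```python
# Hypothetical sketch: count how often each word appears in messages
# labeled with a chosen topic, as a first pass at topic-related words.
from collections import Counter

messages = [("meeting at noon", "work"), ("deadline tomorrow", "work"),
            ("party tonight", "friends"), ("movie this weekend", "friends")]

topic = "work"
in_topic = Counter()
for text, label in messages:
    if label == topic:
        in_topic.update(text.split())

# Words appearing most often under the chosen topic
print(in_topic.most_common(3))
```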

Message Similarity Calculator

  • Description: Compare two individual text messages to see how similar they are in terms of word usage.
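A simple way to compare word usage between two messages is Jaccard similarity over their word sets; this is one reasonable choice among several, sketched here with made-up messages:

```python
# Jaccard similarity: shared words divided by total distinct words.
def jaccard(a: str, b: str) -> float:
    words_a = set(a.lower().split())
    words_b = set(b.lower().split())
    return len(words_a & words_b) / len(words_a | words_b)

# 3 shared words out of 5 distinct words total
print(jaccard("see you at lunch", "see you at dinner"))
```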

Sentiment Word Extractor

  • Description: Find the most common words in positive or negative messages from a conversation.
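Assuming messages arrive pre-labeled with a sentiment (the labels and texts below are hypothetical), the extraction step reduces to counting words within one label:

```python
# Hypothetical sketch: most common words among positive messages.
from collections import Counter

labeled = [("love this so much", "pos"), ("this is great", "pos"),
           ("this is awful", "neg")]

pos_words = Counter(word for text, sentiment in labeled
                    if sentiment == "pos"
                    for word in text.split())
print(pos_words.most_common(2))
```

A fuller version would filter out common stop words ("this", "is") so that genuinely sentiment-bearing words rise to the top.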