
# finding specific conditional entropy (for calculating information gain)


(OP)
I am trying to learn how to calculate information gain, and have hit a brick wall. Gain(Y,X) = entropy(Y) - entropy(Y|X)

The first term, entropy(Y), is easy.

But entropy(Y|X) is the problem...

So entropy(Y|X) = SUM (prob[x] * entropy(Y|X = x)), over all values of X.

But what is entropy(Y|X=x)? How do you find it? I have seen nothing online that explains this, and it seems crucial to calculating information gain.
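For what it's worth, entropy(Y|X=x) is just the plain entropy of Y computed only over the rows of the data where X equals that particular value x: filter the dataset to X = x, then apply the ordinary entropy formula to the Y values that remain. A minimal Python sketch of the whole calculation (function names and the toy data are my own, not from any particular library):

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(Y) of a list of labels, in bits."""
    total = len(values)
    counts = Counter(values)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def conditional_entropy(y, x):
    """H(Y|X): for each value xv of X, compute the entropy of the Y
    values in the rows where X == xv, weighted by prob(X == xv)."""
    total = len(x)
    result = 0.0
    for xv in set(x):
        # entropy(Y|X = xv): restrict Y to rows where X takes this value
        y_given_x = [yi for yi, xi in zip(y, x) if xi == xv]
        result += (len(y_given_x) / total) * entropy(y_given_x)
    return result

def information_gain(y, x):
    """Gain(Y, X) = entropy(Y) - entropy(Y|X)."""
    return entropy(y) - conditional_entropy(y, x)

# Toy example: X perfectly predicts Y, so the gain equals entropy(Y)
y = [0, 0, 1, 1]
x = ['a', 'a', 'b', 'b']
print(information_gain(y, x))  # 1.0
```

When X tells you nothing about Y (e.g. x = ['a', 'b', 'a', 'b'] with the same y above), each conditional entropy entropy(Y|X = x) equals entropy(Y), so the gain comes out 0.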

