If two attributes in a Decision Tree have the same entropy, the attribute with the __________ Gini Index would generally be preferred.
- Equal
- Higher
- Lower
- Random
If two attributes in a Decision Tree have the same entropy, the attribute with the lower Gini Index would generally be preferred. The Gini Index measures node impurity, so a lower value indicates purer child nodes and therefore a better split.
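Below is a minimal sketch of how the two impurity measures compare. The class-probability distributions are made up purely for illustration: both candidate splits have roughly the same entropy (~1 bit), yet split B has the lower Gini impurity and would be the preferred split.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a class-probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability classes (0 * log 0 = 0)
    return -np.sum(p * np.log2(p))

def gini(p):
    """Gini impurity of a class-probability distribution."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

# Hypothetical class distributions produced by two candidate splits.
split_a = [0.5, 0.5, 0.0]             # two equally likely classes
split_b = [0.773, 0.1135, 0.1135]     # one dominant class, two rare ones

for name, dist in [("A", split_a), ("B", split_b)]:
    print(f"Split {name}: entropy = {entropy(dist):.3f}, gini = {gini(dist):.3f}")

# Output (approximately):
#   Split A: entropy = 1.000, gini = 0.500
#   Split B: entropy = 1.000, gini = 0.377
# With entropies tied, split B's lower Gini impurity reflects a purer node,
# so it would generally be the preferred split.
```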