Machine Learning Feature Selection in Python

4.0
68 ratings
Offered by
Coursera Project Network
3,740 already enrolled
In this Guided Project, you will:

Demonstrate univariate filtering methods of feature selection such as SelectKBest

Demonstrate wrapper-based feature selection methods such as Recursive Feature Elimination

Demonstrate feature importance estimation, dimensionality reduction, and lasso regularization techniques

2 hours
Intermediate
No download needed
Split-screen video
English
Desktop only

In this 1-hour long project-based course, you will learn basic principles of feature selection and extraction, and how they can be implemented in Python. Together, we will explore basic Python implementations of Pearson correlation filtering, SelectKBest filtering (using the ANOVA F-test, f_classif), backward sequential filtering, recursive feature elimination (RFE), estimating feature importance with bagged decision trees, lasso regularization, and reducing dimensionality using Principal Component Analysis (PCA). We will focus on the simplest implementations, usually using scikit-learn functions. All of this will be done on Ubuntu Linux, but it can be accomplished using any Python IDE on any operating system. We will use the IDLE development environment to demonstrate several feature selection techniques on the publicly available Pima Diabetes dataset. I would encourage learners to experiment with these techniques not only for feature selection, but for hyperparameter tuning as well. Note: This course works best for learners who are based in the North America region. We’re currently working on providing the same experience in other regions.
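Before any of the selection techniques below, the dataset has to be loaded and split into features and a target. The following is a minimal setup sketch, not the instructor's code: the CSV file name and the column names are assumptions you should adjust to your own copy of the Pima Diabetes dataset.

# Minimal setup sketch (assumed file name and column names, adjust as needed).
import pandas as pd

COLUMNS = [
    "pregnancies", "glucose", "blood_pressure", "skin_thickness",
    "insulin", "bmi", "diabetes_pedigree", "age", "outcome",
]

df = pd.read_csv("pima-indians-diabetes.csv", header=None, names=COLUMNS)
X = df.drop(columns="outcome")   # 8 candidate features
y = df["outcome"]                # binary target: 1 = diabetic, 0 = not

print(X.shape, y.value_counts().to_dict())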

Skills you will develop

Data Science, Feature Selection, Feature Engineering, Feature Extraction, Feature Scaling

Learn step-by-step

In a video that plays in a split screen with your workspace, your instructor will guide you through these steps:

  1. Define terms relating to feature selection and dimensionality reduction

  2. Introduce algorithms with embedded feature selection

  3. Demonstrate two univariate selection methods: Pearson correlation filtering and SelectKBest with f_classif (see the sketch after this list)

  4. Demonstrate two wrapper methods: backward sequential selection and RFE (sketch below)

  5. Demonstrate feature importance estimation using bagged decision trees (sketch below)

  6. Demonstrate dimensionality reduction using Principal Component Analysis (sketch below)

  7. Demonstrate lasso regularization (sketch below)

  8. Expand these concepts to hyperparameter optimization and model selection (sketch below)
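For step 3, here is a brief sketch of the two univariate filters, assuming X and y from the loading sketch above; the 0.2 correlation threshold and k=4 are illustrative choices, not values from the course.

# Two univariate filters: Pearson correlation and SelectKBest with f_classif.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif

# Pearson correlation filtering: keep features whose absolute correlation
# with the target exceeds a chosen threshold (0.2 here is illustrative).
corr = X.apply(lambda col: np.corrcoef(col, y)[0, 1])
print("Correlation filter keeps:", corr[corr.abs() > 0.2].index.tolist())

# SelectKBest with the ANOVA F-test (f_classif): keep the k highest-scoring features.
skb = SelectKBest(score_func=f_classif, k=4)
skb.fit(X, y)
print("SelectKBest keeps:", X.columns[skb.get_support()].tolist())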
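For step 4, a sketch of the two wrapper methods, again assuming X and y are defined; the logistic-regression estimator and n_features_to_select=4 are illustrative assumptions.

# Two wrapper methods: backward sequential selection and RFE.
from sklearn.feature_selection import RFE, SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

base = LogisticRegression(max_iter=1000)

# Backward sequential selection: start from all features and repeatedly drop
# the one whose removal hurts cross-validated accuracy the least.
sfs = SequentialFeatureSelector(base, n_features_to_select=4,
                                direction="backward", cv=5)
sfs.fit(X, y)
print("Backward sequential keeps:", X.columns[sfs.get_support()].tolist())

# Recursive Feature Elimination: fit, drop the weakest feature by coefficient
# magnitude, and refit until the desired number of features remain.
rfe = RFE(base, n_features_to_select=4)
rfe.fit(X, y)
print("RFE keeps:", X.columns[rfe.get_support()].tolist())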
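For step 5, a sketch of feature-importance estimation with an ensemble of decision trees. ExtraTreesClassifier is used here as one reasonable bagged-tree ensemble; the estimator used in the course videos may differ.

# Feature importance from an ensemble of randomized decision trees.
from sklearn.ensemble import ExtraTreesClassifier

trees = ExtraTreesClassifier(n_estimators=100, random_state=0)
trees.fit(X, y)

# Importances sum to 1.0; larger values mean the feature contributed more
# to the splits across the ensemble.
for name, score in sorted(zip(X.columns, trees.feature_importances_),
                          key=lambda pair: pair[1], reverse=True):
    print(f"{name:20s} {score:.3f}")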
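For step 6, a sketch of dimensionality reduction with PCA; standardizing first and keeping 3 components are illustrative choices.

# Dimensionality reduction with Principal Component Analysis.
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Standardize first, because PCA is sensitive to feature scale.
X_scaled = StandardScaler().fit_transform(X)

pca = PCA(n_components=3)
X_reduced = pca.fit_transform(X_scaled)

print("Explained variance ratio:", pca.explained_variance_ratio_)
print("Reduced shape:", X_reduced.shape)   # (n_samples, 3)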
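For step 7, a sketch of lasso regularization used as an embedded selector, with the 0/1 outcome treated as a numeric target; the alpha value is illustrative and untuned.

# Lasso regularization: coefficients driven to exactly zero mark dropped features.
from sklearn.linear_model import Lasso
from sklearn.preprocessing import StandardScaler

X_scaled = StandardScaler().fit_transform(X)

lasso = Lasso(alpha=0.05)
lasso.fit(X_scaled, y)

kept = [name for name, coef in zip(X.columns, lasso.coef_) if coef != 0.0]
print("Lasso keeps:", kept)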
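For step 8, one way to extend these ideas to hyperparameter optimization is to treat the number of selected features as a hyperparameter and tune it jointly with the model. This pipeline-plus-grid-search sketch is an illustration, not the course's exact approach, and the parameter grid is an assumption.

# Tune the number of selected features together with the classifier.
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

pipe = Pipeline([
    ("select", SelectKBest(score_func=f_classif)),
    ("clf", LogisticRegression(max_iter=1000)),
])

grid = GridSearchCV(
    pipe,
    param_grid={"select__k": [2, 4, 6, 8], "clf__C": [0.1, 1.0, 10.0]},
    cv=5,
    scoring="accuracy",
)
grid.fit(X, y)
print("Best parameters:", grid.best_params_)
print("Best CV accuracy:", round(grid.best_score_, 3))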

How Guided Projects work

Your workspace is a cloud desktop right in your browser; no download is required.

In a split-screen video, your instructor gives you step-by-step guidance.

Frequently asked questions

More questions? Visit the Learner Help Center.