GRNET announces, in the context of the SmartAttica EDIH (European Digital Innovation Hub), the 7th of its Training Modules for SMEs, on the subject "Ensemble Learning Techniques"

Date: May 27th, 2025, at 12:00 EEST

Location: Online via Zoom and on-site at the GRNET offices

Presentation Language: Greek

Instructor: Dr. Nikolaos Bakas (GRNET)

Description: Join us for an in-depth seminar on advanced ensemble learning techniques, focusing on Random Forests and XGBoost. The seminar will cover the theoretical foundations, practical implementations, and working code for these powerful machine learning methods. We will also explore hyperparameter tuning strategies to optimize model performance.

Target Audience: This seminar is designed for data scientists, machine learning engineers, and software developers who are interested in enhancing their understanding of ensemble learning techniques. 

Learning Objectives:

  • Understand the principles and workings of Random Forests and XGBoost.

  • Learn how to implement these algorithms using Python and popular libraries.

  • Explore the impact of hyperparameter tuning on model performance.

  • Gain insights into practical applications and case studies.

Prerequisites: Participants should have a basic understanding of machine learning concepts and experience with Python programming. Familiarity with decision trees and ensemble methods will be beneficial but not required.

Indicative Contents:

  1. Introduction to Ensemble Learning

    • Overview of ensemble methods

    • Benefits and challenges

  2. Random Forests

    • Theory and construction of decision trees

    • Bagging and bootstrapping techniques

    • Random feature selection

    • Implementation and case studies
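As a taste of what the Random Forests section covers, the ideas of bagging and random feature selection can be sketched in a few lines with scikit-learn. This is an illustrative example only; the dataset and parameter values are assumptions, not seminar material.

```python
# Illustrative sketch: a Random Forest classifier with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Each tree is trained on a bootstrap sample of the data (bagging),
# and each split considers only a random subset of features.
model = RandomForestClassifier(
    n_estimators=200,      # number of bootstrapped decision trees
    max_features="sqrt",   # random feature selection at each split
    random_state=0,
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

Averaging many decorrelated trees reduces the variance of a single decision tree, which is the core motivation behind the method.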

  3. XGBoost

    • Introduction to boosting

    • Gradient boosting framework

    • Advantages of XGBoost over traditional methods

    • Practical implementation and examples
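The boosting ideas in this section can be previewed with scikit-learn's gradient boosting implementation; XGBoost's `XGBClassifier` exposes a compatible fit/predict interface with additional optimizations (regularization, sparsity handling, parallel tree construction). The dataset and parameters below are illustrative assumptions.

```python
# Illustrative sketch of the gradient boosting framework.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Boosting fits trees sequentially, each one correcting the residual
# errors of the ensemble built so far (gradient descent on the loss).
model = GradientBoostingClassifier(
    n_estimators=100,
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    max_depth=3,         # shallow trees act as weak learners
    random_state=0,
)
model.fit(X_train, y_train)
print(f"test accuracy: {model.score(X_test, y_test):.3f}")
```

In contrast to the Random Forest's independently trained trees, boosting builds the ensemble sequentially, which is where its strength (and its sensitivity to the learning rate) comes from.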

  4. Hyperparameter Tuning

    • Importance of hyperparameter optimization

    • Techniques for tuning Random Forests and XGBoost

    • Tools and libraries for automated tuning
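One common tuning technique discussed in this section, randomized search with cross-validation, can be sketched as follows. The search space and iteration count are illustrative assumptions only.

```python
# Illustrative hyperparameter search for a Random Forest using
# scikit-learn's RandomizedSearchCV.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = load_breast_cancer(return_X_y=True)

# Example search space; in practice this depends on the problem.
param_distributions = {
    "n_estimators": [50, 100, 200],
    "max_depth": [None, 5, 10],
    "max_features": ["sqrt", "log2"],
}
search = RandomizedSearchCV(
    RandomForestClassifier(random_state=0),
    param_distributions,
    n_iter=8,        # sample 8 configurations from the space
    cv=3,            # score each with 3-fold cross-validation
    random_state=0,
)
search.fit(X, y)
print("best params:", search.best_params_)
print(f"best CV accuracy: {search.best_score_:.3f}")
```

Randomized search scales better than exhaustive grid search when the space is large; dedicated libraries such as Optuna automate this further with adaptive sampling.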

  5. Hands-on Session

    • Implementing Random Forests and XGBoost in Python

    • Experimenting with hyperparameter tuning

    • Analyzing results and improving model performance

  6. Conclusion and Q&A

    • Recap of key concepts

    • Open floor for questions and discussion

The project is co-funded by the European Union. Views and opinions expressed are however those of the author(s) only and do not necessarily reflect those of the European Union or the European Commission. Neither the European Union nor the granting authority can be held responsible for them.
