Experiment
Q. What is the Experiment Module?
A. The Experiment Module is a feature of the MlAngles MLOps Platform that enables users to train and evaluate machine learning models using transformed datasets from pipelines.
Q. How do I access the Experiment Module?
A. The steps are:
- From the left-hand sidebar, click the Experiment Module icon.
- The Experiment Homepage will open, displaying all experiment runs along with key details such as run name, project, pipeline, version, status, creator, and timestamps.
Q. Can I customize algorithms before starting a run?
A. Yes. By clicking the Configure Algorithm button, you can:
- Select an ML library.
- Choose one or more algorithms.
- Submit your configuration before starting the experiment run.
Q. How can I create a new experiment run?
A. There are two ways:
- From the Experiment Homepage:
  - Use the Project Selection dropdown to choose a project.
  - Click + New Run to go to the Experiment Run Design page.
- From the Project Workspace:
  - Click on the desired Project Title Card.
  - Navigate to the Experiment Tracking tab.
  - Click New Run.
Q. Does the system provide recommended configurations automatically?
A. Yes. Once a pipeline is selected, the system’s integrated Large Language Model (LLM) analyzes the pipeline metadata and auto-fills recommended parameters to speed up setup.
Q. What is the benefit of using auto-filled parameters?
A. These defaults allow users—especially beginners—to launch experiments quickly with well-optimized configurations, while advanced users can still customize them before execution.
Q. Can I manually tune hyperparameters instead of relying on defaults?
A. Yes. Advanced users can choose to manually configure hyperparameter tuning. The platform supports:
- Optuna – Automated, efficient hyperparameter optimization.
- Hyperopt – Distributed asynchronous search.
- Grid Search – Exhaustive search over a specified hyperparameter space.
Users must define the range or set of values for each parameter to optimize.
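Of the three strategies, Grid Search is the simplest to reason about: every combination in the defined space is evaluated. The sketch below illustrates that idea in plain Python; the parameter names and scoring function are illustrative stand-ins, not part of the platform's API.

```python
from itertools import product

# Hypothetical search space: every combination is evaluated exhaustively.
param_grid = {
    "learning_rate": [0.001, 0.01, 0.1],
    "max_depth": [3, 5, 7],
}

def evaluate(params):
    # Stand-in validation score; a real experiment would train a model here.
    return (params["learning_rate"] - 0.01) ** 2 + (params["max_depth"] - 5) ** 2

keys = list(param_grid)
candidates = (dict(zip(keys, combo)) for combo in product(*param_grid.values()))
best = min(candidates, key=evaluate)
print(best)  # → {'learning_rate': 0.01, 'max_depth': 5}
```

Optuna and Hyperopt explore the same kind of space, but sample it adaptively instead of exhaustively, which is why they scale better to large parameter ranges.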
Q. How do I launch an experiment after configuration?
A. After selecting defaults or customizing settings:
- Click Submit to queue the experiment.
- The system schedules execution and redirects to the Project’s Experiment Page.
- From there, users can:
  - Monitor real-time progress.
  - View logs and status updates (Running, Failed, Success).
  - Access trained models and outputs once the run is complete.
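If run statuses are also available programmatically (an assumption; this document only describes the UI), a client-side wait loop over the Running/Failed/Success states could look like the sketch below. `get_status` is a hypothetical callback standing in for whatever client call returns a run's status string.

```python
import time

# The two terminal states named in the status list above.
TERMINAL_STATUSES = {"Failed", "Success"}

def wait_for_run(get_status, run_id, poll_seconds=1.0, max_polls=100):
    """Poll until the run leaves the Running state or give up."""
    for _ in range(max_polls):
        status = get_status(run_id)
        if status in TERMINAL_STATUSES:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"run {run_id} not finished after {max_polls} polls")

# Usage with a fake status source that finishes on the third poll.
states = iter(["Running", "Running", "Success"])
result = wait_for_run(lambda _id: next(states), "run-42", poll_seconds=0)
print(result)  # → Success
```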
Q. Where can I view experiment results?
A. To view the results of any experiment run:
- Navigate to the Project’s Experiment Page.
- Locate the desired run.
- Click the Run Name to open the Run Details Panel.
Q. How do I register a model for deployment?
A. Steps to register a model:
- Open the Artifacts tab.
- Click Model Hub → Select a model.
- Provide a Registration Name.
- Click Register. The model is then available in the Model Hub Module for deployment.
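As a sketch only: if registration were also driven through an API rather than the UI, the request body would carry a model reference and the Registration Name. The field names below are hypothetical, not a documented MlAngles schema; the only rule taken from the steps above is that a Registration Name must be provided.

```python
import json

def build_registration_request(model_id, registration_name):
    """Assemble a registration body; 'model_id' and 'name' are
    hypothetical field names used for illustration only."""
    if not registration_name:
        raise ValueError("a Registration Name is required")
    return {"model_id": model_id, "name": registration_name}

body = build_registration_request("experiment-7/model-3", "fraud-detector-v1")
print(json.dumps(body))
```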