# batch-gradient-descent

Here are 41 public repositories matching this topic...

This repository contains a project that demonstrates sentiment analysis on Twitter data using Apache Spark, covering data preprocessing, feature engineering, model training, and evaluation (a minimal pipeline sketch follows this entry).

  • Updated Jul 9, 2024
  • Jupyter Notebook
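
The entry above describes a full Spark pipeline without showing code, so here is a minimal sketch of what such a pipeline might look like with `pyspark.ml`. The file name `tweets.csv` and the `text`/`label` column names are assumptions for illustration, not taken from the repository.

```python
# Hypothetical sketch of a tweet-sentiment pipeline with Spark ML:
# load labelled tweets, tokenize, vectorize with TF-IDF, train, evaluate.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF, IDF
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("tweet-sentiment").getOrCreate()

# Assumed schema: a "text" column with the tweet and a binary "label" column.
tweets = spark.read.csv("tweets.csv", header=True, inferSchema=True)
train, test = tweets.randomSplit([0.8, 0.2], seed=42)

pipeline = Pipeline(stages=[
    Tokenizer(inputCol="text", outputCol="tokens"),    # split tweets into words
    HashingTF(inputCol="tokens", outputCol="tf"),      # hashed term frequencies
    IDF(inputCol="tf", outputCol="features"),          # reweight by inverse document frequency
    LogisticRegression(featuresCol="features", labelCol="label"),
])

model = pipeline.fit(train)
auc = BinaryClassificationEvaluator(labelCol="label").evaluate(model.transform(test))
print(f"Test AUC: {auc:.3f}")
```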

Gradient descent is a technique for optimizing machine learning models with differentiable loss functions: it iteratively computes the first-order derivative (gradient) of the loss and adjusts the parameters in the direction that reduces it. A minimal sketch of the batch variant follows this entry.

  • Updated Apr 26, 2024
  • Jupyter Notebook
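
To make the description above concrete, here is a minimal NumPy sketch of batch gradient descent applied to linear regression with a mean-squared-error loss. The function name, hyperparameters, and synthetic data are illustrative assumptions, not code from the repository.

```python
# Minimal sketch of batch gradient descent for linear regression (MSE loss).
import numpy as np

def batch_gradient_descent(X, y, lr=0.01, n_iters=1000):
    """Fit weights w and bias b by stepping against the full-batch gradient."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(n_iters):
        error = X @ w + b - y
        # First-order derivatives of the MSE loss over the *entire* batch.
        grad_w = (2.0 / n_samples) * (X.T @ error)
        grad_b = (2.0 / n_samples) * error.sum()
        # Move the parameters a small step in the direction that lowers the loss.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Tiny usage example on synthetic data (y ≈ 3x + 2).
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 2 + rng.normal(scale=0.1, size=200)
w, b = batch_gradient_descent(X, y, lr=0.1, n_iters=2000)
print(w, b)  # should be close to [3.0] and 2.0
```

Unlike stochastic or mini-batch variants, each update here uses the gradient computed over the whole dataset, which is what gives the method its name.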

Python machine learning applications in image processing, recommender system, matrix completion, netflix problem and algorithm implementations including Co-clustering, Funk SVD, SVD++, Non-negative Matrix Factorization, Koren Neighborhood Model, Koren Integrated Model, Dawid-Skene, Platt-Burges, Expectation Maximization, Factor Analysis, ISTA, F…

  • Updated Dec 16, 2022
  • Jupyter Notebook
