Tutorial (French)
Examples
(original entry: covid_sird_mixture.py:docstring of aftercovid.models.covid_sird_mixture.CovidSIRDMixture, line 16)
(original entry: covid_sird.py:docstring of aftercovid.models.covid_sird.CovidSIRD, line 13)
(original entry: covid_sird_cst.py:docstring of aftercovid.models.covid_sird_cst.CovidSIRDc, line 80)
Stochastic Gradient Descent applied to linear regression
The following example shows how to optimize a simple linear regression with stochastic gradient descent.
<<<
import numpy
from aftercovid.optim import SGDOptimizer

def fct_loss(c, X, y):
    # full-batch squared error, used for reporting the loss
    return numpy.linalg.norm(X @ c - y) ** 2

def fct_grad(c, x, y, i=0):
    # per-sample gradient of the squared error, scaled by 0.1
    return x * (x @ c - y) * 0.1

# noiseless linear data: y = X @ coef
coef = numpy.array([0.5, 0.6, -0.7])
X = numpy.random.randn(10, 3)
y = X @ coef

# start from random coefficients and run 15 SGD iterations
sgd = SGDOptimizer(numpy.random.randn(3))
sgd.train(X, y, fct_loss, fct_grad, max_iter=15, verbose=True)
print('optimized coefficients:', sgd.coef)
>>>
0/15: loss: 14.42 lr=0.1
1/15: loss: 3.113 lr=0.1
2/15: loss: 1.169 lr=0.1
3/15: loss: 0.7378 lr=0.1
4/15: loss: 0.2461 lr=0.1
5/15: loss: 0.08405 lr=0.1
6/15: loss: 0.03233 lr=0.1
7/15: loss: 0.01207 lr=0.1
8/15: loss: 0.01509 lr=0.1
9/15: loss: 0.007304 lr=0.1
10/15: loss: 0.002011 lr=0.1
11/15: loss: 0.004316 lr=0.1
12/15: loss: 0.002956 lr=0.1
13/15: loss: 0.0002909 lr=0.1
14/15: loss: 0.0002441 lr=0.1
15/15: loss: 0.000134 lr=0.1
optimized coefficients: [ 0.503 0.602 -0.701]
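For reference, the update rule that SGDOptimizer presumably applies can be sketched in plain NumPy. This is a hypothetical re-implementation, not the library's actual code: at each step a sample (x_i, y_i) is drawn and the coefficients move against the per-sample gradient, scaled by the learning rate.

```python
import numpy

def fct_grad(c, x, y):
    # per-sample gradient of the squared error (up to a constant factor)
    return x * (x @ c - y)

rng = numpy.random.default_rng(0)

# noiseless linear data, same setup as the example above
coef = numpy.array([0.5, 0.6, -0.7])
X = rng.standard_normal((10, 3))
y = X @ coef

c = rng.standard_normal(3)  # random starting point
lr = 0.1                    # learning rate

for epoch in range(50):
    # visit the samples in a random order each epoch
    for i in rng.permutation(len(X)):
        c -= lr * fct_grad(c, X[i], y[i])

print("estimated coefficients:", c)
```

Because the data is noiseless, the iterates contract toward the true coefficients; on noisy data, a decaying learning rate would be needed to converge.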
(original entry: sgd.py:docstring of aftercovid.optim.sgd.SGDOptimizer, line 32)
Command line
Command check
check
Checks that the module works as expected.
<<<
python -m aftercovid check --help
>>>
--SCRIPT--
-m aftercovid check --help
--OUT--
--ERR--
INFO: Showing help with the command '__main__.py check -- --help'.
NAME
    __main__.py check - Runs a couple of functions to check the module is working.

SYNOPSIS
    __main__.py check <flags>

DESCRIPTION
    Runs a couple of functions to check the module is working.

FLAGS
    -v, --verbose=VERBOSE
        Default: 1
        0 to hide the standard output
--PATH--
None
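A command of the form `python -m <module> ...` can also be launched from Python itself. The sketch below illustrates this generic pattern with subprocess; it uses the standard-library module json.tool as a stand-in target, since aftercovid may not be installed where this runs (to invoke the actual command, replace the arguments with "aftercovid", "check").

```python
import subprocess
import sys

# Run "python -m <module> ..." with the same interpreter as the caller;
# json.tool simply pretty-prints the JSON document fed on stdin.
proc = subprocess.run(
    [sys.executable, "-m", "json.tool"],
    input='{"a": 1}',
    capture_output=True,
    text=True,
)
print(proc.stdout)
```

capture_output=True with text=True collects stdout and stderr as strings, which mirrors the --OUT--/--ERR-- sections shown above.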