- python experiment.py
- Using gpu device 0: GeForce GTX TITAN Black
- /usr/local/lib/python2.7/dist-packages/Lasagne-0.1.dev-py2.7.egg/lasagne/init.py:86: UserWarning: The uniform initializer no longer uses Glorot et al.'s approach to determine the bounds, but defaults to the range (-0.01, 0.01) instead. Please use the new GlorotUniform initializer to get the old behavior. GlorotUniform is now the default for all layers.
- warnings.warn("The uniform initializer no longer uses Glorot et al.'s "
- Loading data...
- (168, 39)
- /usr/local/lib/python2.7/dist-packages/Lasagne-0.1.dev-py2.7.egg/lasagne/layers/helper.py:69: UserWarning: get_all_layers() has been changed to return layers in topological order. The former implementation is still available as get_all_layers_old(), but will be removed before the first release of Lasagne. To ignore this warning, use `warnings.filterwarnings('ignore', '.*topo.*')`.
- warnings.warn("get_all_layers() has been changed to return layers in "
- /usr/local/lib/python2.7/dist-packages/theano/scan_module/scan.py:1017: Warning: In the strict mode, all neccessary shared variables must be passed as a part of non_sequences
- 'must be passed as a part of non_sequences', Warning)
- Computing updates...
- Compiling functions...
- /usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_perform_ext.py:133: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility
- from scan_perform.scan_perform import *
- Training...
- 1.37378001213
- 1.32035112381
- 1.32456994057
- 1.32178497314
- 1.32462310791
- 1.32277989388
- 1.32382488251
- 1.32401013374
- 1.32540011406
- 1.32264709473
- 1.3265478611
- 1.32395792007
- 1.32552886009
- 1.32278013229
- 1.32727503777
- 1.3229329586
- 1.32568001747
- 1.32218718529
- 1.32564997673
- 1.32446789742
- Epoch 0 took 1.32446789742, cost = 0.236172556877, error = 0.949728061149
- 1.32659506798
- 1.32414102554
- 1.3248398304
- 1.32406401634
- 1.3254160881
- 1.32578611374
- 1.32594680786
- 1.32349205017
- 1.32587790489
- 1.3234641552
- 1.32493591309
- 1.32370615005
- 1.32686805725
- 1.31945204735
- 1.32277989388
- 1.32013201714
- 1.32330608368
- 1.32456588745
- 1.32740783691
- 1.32403206825
- Epoch 1 took 1.32403206825, cost = nan, error = 0.985447596649
- 1.32458901405
- 1.32266998291
- 1.32541108131
- 1.32362580299
- 1.32612109184
- 1.32395792007
- 1.32669401169
- 1.32302689552
- 1.32604598999
- 1.32323694229
- 1.3291079998
- 1.32212209702
- 1.32700800896
- 1.32311201096
- 1.33942484856
- 1.35423398018
- 1.32460212708
- 1.3217189312
- 1.32654309273
- 1.32117795944
- Epoch 2 took 1.32117795944, cost = nan, error = 0.985447596649
- 1.32549214363
- 1.3265068531
- 1.33263301849
- 1.32968115807
- 1.33673191071
- 1.33021497726
- 1.33883690834
- 1.36176300049
- 1.34264016151
- 1.33736395836
- 1.33167695999
- 1.32827877998
- 1.33537817001
- 1.33314585686
- 1.35636711121
- 1.34423995018
- 1.3395011425
- 1.33643293381
- 1.33976006508
- 1.33662509918
- Epoch 3 took 1.33662509918, cost = nan, error = 0.985447596649
- 1.333589077
- 1.3309340477
- 1.36852192879
- 1.33266496658
- 1.34567189217
- 1.33116602898
- 1.3335518837
- 1.33862900734
- 1.34646511078
- 1.33973097801
- 1.33759403229
- 1.3336250782
- 1.33578515053
- 1.33739089966
- 1.34413218498
- 1.34501600266
- 1.34341812134
- 1.33504104614
- 1.34449505806
- 1.34248209
- Epoch 4 took 1.34248209, cost = nan, error = 0.985447596649
- 1.34506082535
- 1.33236694336
- 1.34146118164
- 1.33226299286
- 1.33261013031
- 1.34553003311
- 1.35776209831
- 1.33256912231
- 1.33536100388
- 1.33282804489
- 1.36304688454
- 1.36115098
- 1.39771103859
- 1.41714501381
- 1.35126900673
- 1.33921909332
- 1.35050702095
- 1.3311150074
- 1.33682894707
- 1.33464884758
- Epoch 5 took 1.33464884758, cost = nan, error = 0.985447596649
- 1.34731292725
- 1.33311104774
- 1.33487915993
- 1.33093595505
- 1.36071395874
- 1.33625793457
- 1.33785796165
- 1.33547019958
- 1.33956599236
- 1.33442282677
- 1.35223317146
- 1.32734799385
- 1.33619403839
- 1.35089898109
- 1.33862018585
- 1.33473300934
- 1.34002017975
- 1.32798314095
- 1.33312296867
- 1.34791517258
- Epoch 6 took 1.34791517258, cost = nan, error = 0.985447596649
- 1.34169697762
- 1.33289194107
- 1.34308600426
- 1.36367511749
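The log above shows the cost turning to `nan` from epoch 1 onward while the error stays pinned at 0.985447596649, which usually means the optimization diverged (for example, a learning rate that is too high, or exploding gradients inside the scan loop). A minimal sketch of a NaN guard one could add to a training loop like this one; `train_batch` and its signature are assumptions for illustration, not part of the pasted script:

```python
import math


def train(num_epochs, num_batches, train_batch):
    """Run training, aborting as soon as the cost becomes NaN.

    `train_batch` stands in for the compiled training function
    (hypothetical: takes a batch index, returns the batch cost
    as a float).
    """
    for epoch in range(num_epochs):
        epoch_cost = 0.0
        for batch in range(num_batches):
            cost = train_batch(batch)
            # Stop immediately instead of logging nan for every
            # remaining epoch, as happens in the log above.
            if math.isnan(cost):
                raise FloatingPointError(
                    "cost became NaN at epoch %d, batch %d; "
                    "consider a lower learning rate or gradient clipping"
                    % (epoch, batch))
            epoch_cost += cost
        print("Epoch %d, mean cost = %f" % (epoch, epoch_cost / num_batches))
```

Catching the first NaN batch also narrows down where the divergence starts, which the per-epoch summaries in this log cannot do.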