skaae / experiment.py

Jun 10th, 2015
python experiment.py
Using gpu device 0: GeForce GTX TITAN Black
/usr/local/lib/python2.7/dist-packages/Lasagne-0.1.dev-py2.7.egg/lasagne/init.py:86: UserWarning: The uniform initializer no longer uses Glorot et al.'s approach to determine the bounds, but defaults to the range (-0.01, 0.01) instead. Please use the new GlorotUniform initializer to get the old behavior. GlorotUniform is now the default for all layers.
  warnings.warn("The uniform initializer no longer uses Glorot et al.'s "
Loading data...
(168, 39)
/usr/local/lib/python2.7/dist-packages/Lasagne-0.1.dev-py2.7.egg/lasagne/layers/helper.py:69: UserWarning: get_all_layers() has been changed to return layers in topological order. The former implementation is still available as get_all_layers_old(), but will be removed before the first release of Lasagne. To ignore this warning, use `warnings.filterwarnings('ignore', '.*topo.*')`.
  warnings.warn("get_all_layers() has been changed to return layers in "
/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan.py:1017: Warning: In the strict mode, all neccessary shared variables must be passed as a part of non_sequences
  'must be passed as a part of non_sequences', Warning)
Computing updates...
Compiling functions...
/usr/local/lib/python2.7/dist-packages/theano/scan_module/scan_perform_ext.py:133: RuntimeWarning: numpy.ndarray size changed, may indicate binary incompatibility
  from scan_perform.scan_perform import *
Training...
1.37378001213
1.32035112381
1.32456994057
1.32178497314
1.32462310791
1.32277989388
1.32382488251
1.32401013374
1.32540011406
1.32264709473
1.3265478611
1.32395792007
1.32552886009
1.32278013229
1.32727503777
1.3229329586
1.32568001747
1.32218718529
1.32564997673
1.32446789742
Epoch 0 took 1.32446789742, cost = 0.236172556877, error = 0.949728061149
1.32659506798
1.32414102554
1.3248398304
1.32406401634
1.3254160881
1.32578611374
1.32594680786
1.32349205017
1.32587790489
1.3234641552
1.32493591309
1.32370615005
1.32686805725
1.31945204735
1.32277989388
1.32013201714
1.32330608368
1.32456588745
1.32740783691
1.32403206825
Epoch 1 took 1.32403206825, cost = nan, error = 0.985447596649
1.32458901405
1.32266998291
1.32541108131
1.32362580299
1.32612109184
1.32395792007
1.32669401169
1.32302689552
1.32604598999
1.32323694229
1.3291079998
1.32212209702
1.32700800896
1.32311201096
1.33942484856
1.35423398018
1.32460212708
1.3217189312
1.32654309273
1.32117795944
Epoch 2 took 1.32117795944, cost = nan, error = 0.985447596649
1.32549214363
1.3265068531
1.33263301849
1.32968115807
1.33673191071
1.33021497726
1.33883690834
1.36176300049
1.34264016151
1.33736395836
1.33167695999
1.32827877998
1.33537817001
1.33314585686
1.35636711121
1.34423995018
1.3395011425
1.33643293381
1.33976006508
1.33662509918
Epoch 3 took 1.33662509918, cost = nan, error = 0.985447596649
1.333589077
1.3309340477
1.36852192879
1.33266496658
1.34567189217
1.33116602898
1.3335518837
1.33862900734
1.34646511078
1.33973097801
1.33759403229
1.3336250782
1.33578515053
1.33739089966
1.34413218498
1.34501600266
1.34341812134
1.33504104614
1.34449505806
1.34248209
Epoch 4 took 1.34248209, cost = nan, error = 0.985447596649
1.34506082535
1.33236694336
1.34146118164
1.33226299286
1.33261013031
1.34553003311
1.35776209831
1.33256912231
1.33536100388
1.33282804489
1.36304688454
1.36115098
1.39771103859
1.41714501381
1.35126900673
1.33921909332
1.35050702095
1.3311150074
1.33682894707
1.33464884758
Epoch 5 took 1.33464884758, cost = nan, error = 0.985447596649
1.34731292725
1.33311104774
1.33487915993
1.33093595505
1.36071395874
1.33625793457
1.33785796165
1.33547019958
1.33956599236
1.33442282677
1.35223317146
1.32734799385
1.33619403839
1.35089898109
1.33862018585
1.33473300934
1.34002017975
1.32798314095
1.33312296867
1.34791517258
Epoch 6 took 1.34791517258, cost = nan, error = 0.985447596649
1.34169697762
1.33289194107
1.34308600426
1.36367511749