Rem CheatGYP version 1.0.
Rem The simplest AI example imaginable using ML.
Rem https://pastebin.com/AuDdj1RQ
Rem Developed and tested in QBjs
Rem https://qbjs.org/
Rem We want our system to compute answers to the formula
Rem y = m * x
Rem where x is the input,
Rem m is some multiplier that we need to learn from the data,
Rem and y is the output.
Rem Step 1: Train the system
Rem Our training data is just the three times table we
Rem typically learn in second grade.
Rem Each line of data is sample input, sample output.
TrainingData:
Data 1,3
Data 2,6
Data 3,9
Data 4,12
Data 5,15
Data 6,18
Data 7,21
Data 8,24
Data 9,27
Data 10,30
Data 11,33
Data 12,36
Screen 0
Cls
Randomize 0
m = Rnd * 10 ' start with a random weight between 0 and 10 (Rnd(10) would not scale; Rnd ignores a positive argument)
gamma = 1 ' learning rate
Print "i", "m" ' output a table showing iteration, weight
Print 0, m
Restore TrainingData
For i = 1 To 12 ' loop through the training data
Read x, y
erf = (y - m * x) ^ 2 ' error function: the squared prediction error (y - m x)^2
DerfDm = -2 * x * (y - m * x) ' its gradient \frac{d\,erf}{dm} = -2x(y - mx), by the chain rule
If DerfDm <> 0 Then
m = m - gamma * erf / DerfDm ' update weight with a Newton-Raphson-style step (erf over its derivative), the "cheat" that makes gamma = 1 stable
End If
Print i, m
Next
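Rem Because erf / DerfDm = -(y - m * x) / (2 * x), with gamma = 1 the
Rem update above simplifies to m = (m + y / x) / 2: every sample halves
Rem the gap between m and the true multiplier 3, so after twelve samples
Rem the gap has shrunk by a factor of 2 ^ 12 = 4096.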
Rem Step 2: Now use the "AI" to answer multiplication questions.
Input "Enter a number or q to quit"; i$
While (i$ <> "q") And (i$ <> "Q")
x = Val(i$)
Print "The answer is ", m * x
Input "Enter a number or q to quit"; i$
Wend
Rem Things to consider:
Rem
Rem Gradient descent requires a lot of examples to learn.
Rem Gradient descent converges slowly, if at all.
Rem Gradient descent only produces approximate results.
Rem We can fix these issues with various tricks and cheats
Rem (the sketch below shows the uncheated update for comparison).
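Rem A minimal sketch of the plain, uncheated gradient descent update
Rem m = m - gamma * d(erf)/dm. The tiny learning rate and the epoch
Rem count are illustrative choices: the plain step diverges once
Rem gamma * 2 * x ^ 2 exceeds 2 (here gamma must stay under 1/144),
Rem and it needs many passes over the same twelve examples to get
Rem close to 3. (This comparison run happens after you quit the quiz.)
m2 = 0 ' fresh weight for the comparison run
gamma2 = .001 ' far smaller learning rate than the gamma = 1 above
For epoch = 1 To 20 ' many passes over the same training data
Restore TrainingData
For i = 1 To 12
Read x, y
m2 = m2 + gamma2 * 2 * x * (y - m2 * x) ' i.e. m2 - gamma2 * DerfDm2
Next
Next
Print "Plain gradient descent learned m ="; m2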
Rem
Rem Gradients get hard to compute when m is not just a scalar
Rem but a matrix of weights or the composition of lots
Rem of functions and transformations!
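Rem For example, stacking two functions y = f(g(x)) turns the gradient
Rem into a product via the chain rule:
Rem \frac{d\,erf}{d w} = \frac{d\,erf}{d f} \cdot \frac{d f}{d g} \cdot \frac{d g}{d w}
Rem (w here is an illustrative weight inside g); backpropagation just
Rem evaluates such products layer by layer.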
Rem
Rem This example only learns the multiplier;
Rem it doesn't actually learn multiplication.
Rem In fact, it exploits the fact that multiplication
Rem is built right into the programming language and CPU.
Rem
Rem ML learns very differently from how schoolchildren do:
Rem In grammar school, we simply memorize these tables.
Rem We later learn an algorithm to multiply bigger numbers.
Rem We also learn simplifying rules like:
Rem 0 * n = 0 <-- multiplication by zero
Rem 1 * n = n <-- multiplicative identity
Rem m * n = n * m <-- commutativity
Rem (a + b) * n = a * n + b * n <-- distributivity
Rem \bar{n} * 10 = \bar{n0} <-- multiplying by 10 appends a zero digit
Rem \bar{n} * 11 = \bar{nn} <-- for a single digit n, 11 repeats it
Rem the digital root of 9 * n is 9 (for n >= 1; checked below)
Rem We could also just count on our fingers when we got stuck!
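Rem A quick check of the digital-root rule (an illustrative extra,
Rem not part of the original lesson): repeatedly summing the digits
Rem of 9 * n always ends at 9.
For n = 1 To 12
dr = 9 * n
While dr > 9 ' keep reducing until a single digit remains
s = 0
While dr > 0
s = s + dr Mod 10 ' add up the digits
dr = dr \ 10
Wend
dr = s
Wend
Print "digital root of"; 9 * n; "is"; dr
Next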