My Learning Path & Journey into AI

PART-1

Harsha Rao
3 min read · May 30, 2021
AI Journey Map

30-May-2021 (Sunday)

Today I learnt that it is possible for AI enthusiasts like me to work on Machine Learning (ML) using low-code and no-code ML platforms. My greatest fear and apprehension originates from a lack of focussed time to learn full-blown coding on any platform. For the less fortunate like me, machine learning becomes far more accessible when little or no coding is involved.

What is the difference between low-code and no-code?

Low-code simply stands for a reduced amount of coding. A lot of elements can be simply dragged and dropped from the library. However, it is also possible to customise them by writing your own code, which gives increased flexibility.
No-code platforms require no knowledge of programming at all. They can be used by different people like artists, teachers and managers, who need AI in their work but don’t want to dive deep into programming and computer science. No-code solutions are quite limited in functionality but allow you to build something simple quickly.

In practice, the border between no-code and low-code platforms is pretty thin. Platforms that promote themselves as ‘no-code’ still usually leave some space for customisation.

Low-code platforms:

PyCaret | AutoViML | H2O AutoML |

No-code platforms:

Google Cloud AutoML | Google ML Kit | Teachable Machine | Runway AI | Lobe | Obviously AI | Create ML | Make ML | Fritz AI | Super Annotate | Rapid Miner | What-If Tool | Data Robot | Nanonets AI | Monkey Learn Studio
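To give a flavour of what “low-code” means in practice, here is a rough sketch using PyCaret from the list above. Treat it as an illustration rather than a recipe: the exact setup() arguments vary between PyCaret versions, and the toy data simply follows a linear rule so the models have something to learn.

import pandas as pd
from random import randint
from pycaret.regression import setup, compare_models

# Build a small toy dataset where y follows a simple linear rule
rows = [[randint(0, 1000), randint(0, 1000), randint(0, 1000)] for _ in range(100)]
df = pd.DataFrame(rows, columns=['x1', 'x2', 'x3'])
df['y'] = df['x1'] + 2*df['x2'] + 3*df['x3']

setup(data=df, target='y', session_id=123)  # one call prepares the whole experiment
best = compare_models()                     # trains and ranks a shelf of models

Two lines of actual work, and PyCaret handles the preprocessing, training and model comparison behind the scenes. That is the appeal of low-code for people like me.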

31-May-2021 (Monday)

Today I learnt about the Machine Learning model called “Linear Regression”. If this model is exposed to a large number of inputs along with the output that applies to each of them, it analyses the data and tries to figure out the relationship between the inputs and the result.
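In other words, the model assumes the output is (roughly) a weighted sum of the inputs and tries to learn the weights. For three inputs that looks like:

y ≈ w1*x1 + w2*x2 + w3*x3 + b

where w1, w2 and w3 are the weights (coefficients) and b is the intercept.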

Let me explain with this very popular example.

Before we begin, don’t forget to install scikit-learn. It provides easy-to-use functions and predefined models, which saves a lot of time.

pip install scikit-learn

Sample Training Set

Here, X is the input and y is the output.
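A few rows of such a training set might look like this (each y is computed as x1 + 2*x2 + 3*x3):

x1   x2   x3   |   y
 1    1    1   |    6
 2    0    5   |   17
10   20   30   |  140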

Given the training set you could easily guess that the output (y) is nothing but (x1 + 2*x2 + 3*x3).

How To Generate Training Set

Python Code:

from random import randint

TRAIN_SET_LIMIT = 1000   # upper bound for each random input value
TRAIN_SET_COUNT = 100    # number of training examples to generate

TRAIN_INPUT = list()
TRAIN_OUTPUT = list()

for i in range(TRAIN_SET_COUNT):
    a = randint(0, TRAIN_SET_LIMIT)
    b = randint(0, TRAIN_SET_LIMIT)
    c = randint(0, TRAIN_SET_LIMIT)
    op = a + (2*b) + (3*c)           # the "hidden" relationship the model should learn
    TRAIN_INPUT.append([a, b, c])
    TRAIN_OUTPUT.append(op)
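If you want the generated training set to stay the same across runs, one optional tweak is to seed Python’s random module before the loop:

from random import seed

seed(42)   # any fixed value gives reproducible "random" numbers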

Working with a linear regression model is simple: create a model, train it, and then use it :)

Train The Model

We have the training set ready, so let’s create a Linear Regression model and pass it the training data.

Python Code:

from sklearn.linear_model import LinearRegression

# n_jobs=-1 lets scikit-learn use all available CPU cores
predictor = LinearRegression(n_jobs=-1)
predictor.fit(X=TRAIN_INPUT, y=TRAIN_OUTPUT)
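A quick sanity check at this point is to ask the fitted model for its R² score on the training data. Since our data is noise-free and genuinely linear, it should come out as essentially 1.0:

print(predictor.score(X=TRAIN_INPUT, y=TRAIN_OUTPUT))   # expect a value very close to 1.0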

Test Data

X = [[10, 20, 30]]

The outcome should be 10 + 20*2 + 30*3 = 140. Let’s see what we got…

Python Code:

X_TEST = [[10, 20, 30]]

outcome = predictor.predict(X=X_TEST)
coefficients = predictor.coef_          # the weights the model learnt for x1, x2, x3

print('Outcome : {}\nCoefficients : {}'.format(outcome, coefficients))

Outcome : [ 140.]
Coefficients : [ 1. 2. 3.]

Did you notice what just happened? The model had access to the training data, through which it calculated the weights to assign to the inputs in order to arrive at the desired output. When given the test data, it successfully managed to get the right answer!
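If you want one more check, try an input that (almost certainly) wasn’t in the training set; the prediction should again match the formula (5 + 2*5 + 3*5 = 30):

print(predictor.predict([[5, 5, 5]]))   # expect something very close to [30.]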
