TensorFlow for JavaScript for ClojureScript

Among the many announcements at the TensorFlow Dev Summit was TensorFlow for JavaScript, and of course I wanted to play with it ... from ClojureScript.

These are the steps I took to get a simple polynomial regression example working in cljs. I created a re-frame template app, but that's not important. I just needed a place to keep the code, and I liked having a button to press to fire the function.

First, add a script tag to your HTML file to load the tfjs library.

<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs@0.6.1"></script>

At first I used plain JavaScript interop but quickly got tired of typing (.bar js/tf (clj->js foo)), so I created a couple of convenience functions. I'm not sure this was really worth it, and I'm sure someone will come up with a better and more complete wrapper soon.

;; Thin wrappers over the tf.* constructors that convert Clojure data
;; to JavaScript with clj->js before handing it to TensorFlow.js.
(defn tensor
  ([data]
   (.tensor js/tf (clj->js data)))
  ([data shape]
   (.tensor js/tf (clj->js data) (clj->js shape))))

(defn scalar
  ([data]
   (.scalar js/tf (clj->js data))))

(defn tensor1d
  ([data]
   (.tensor1d js/tf (clj->js data)))
  ([data dtype]
   ;; tf.tensor1d takes an optional dtype string (e.g. "float32"),
   ;; not a shape, as its second argument
   (.tensor1d js/tf (clj->js data) dtype)))

(defn tensor2d
  ([data]
   (.tensor2d js/tf (clj->js data)))
  ([data shape]
   (.tensor2d js/tf (clj->js data) (clj->js shape))))

(defn tensor3d
  ([data]
   (.tensor3d js/tf (clj->js data)))
  ([data shape]
   (.tensor3d js/tf (clj->js data) (clj->js shape))))

(defn tensor4d
  ([data]
   (.tensor4d js/tf (clj->js data)))
  ([data shape]
   (.tensor4d js/tf (clj->js data) (clj->js shape))))

(defn zeros [shape]
  (.zeros js/tf (clj->js shape)))

(defn ones [shape]
  (.ones js/tf (clj->js shape)))

(defn variable [tensor-data]
  (.variable js/tf tensor-data))
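With these in place, tensor creation reads much more naturally. For example:

(tensor2d [[1 2] [3 4]])     ; a 2x2 tensor from nested vectors
(tensor1d (range -1 1 0.5))  ; clj->js handles any Clojure sequence
(variable (scalar 0.5))      ; a trainable scalar variable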

Then I put the meat of the code in the callback for a button click. This isn't the best way to organize the code, but again, I just wanted to explore the framework.

The first function, generate-data, evaluates the polynomial a * x^3 + b * x^2 + c * x + d for a range of values of x and returns the xs and ys.

The polynomial-regression function sets up our TensorFlow variables and the prediction, loss and training functions we'll need. The predict function evaluates the polynomial using the current coefficient variables. The loss function calculates the mean squared error between the predicted values and the true sampled values.
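In plain math the loss is mean((pred - label)^2). Just to make the tensor version below concrete, here's the same calculation over plain number sequences (a sketch for illustration only; the real loss below runs on tensors):

(defn mse [preds labels]
  (/ (reduce + (map (fn [p y] (let [e (- p y)] (* e e))) preds labels))
     (count preds)))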

Variables are important (and non-Clojure-y) in that they start out random and TensorFlow adjusts their values iteratively in an effort to find the values that minimize the error.

The adjustments are made in the opposite direction of the error gradient, in proportion to the learning rate. Variables are also important because TensorFlow keeps track of how they were calculated, so that it can automatically compute the gradients. That's the main difference between a variable and a tensor.
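For example, using the wrappers above, a variable starts from whatever value you hand it (a random one here), and dataSync copies its current contents back out into a JavaScript typed array:

(def w (variable (scalar (rand)))) ; starts at a random value
(aget (.dataSync w) 0)             ; read the current value back out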

The optimizer handles the forward pass, which calculates the predictions and the error/loss, and the backward pass, which calculates the gradients and makes the appropriate adjustments.

We learn better and better values with each iteration, so we run many iterations until the error is close enough or is no longer improving.
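Written out for a single coefficient w, the update SGD applies on each step is just the following (a plain-number sketch; TensorFlow.js applies it with tensors across all the variables at once):

(defn sgd-step [w dloss-dw learning-rate]
  ;; step w a small amount against the gradient of the loss
  (- w (* learning-rate dloss-dw)))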

(defn generate-data [n a b c d]
  (let [eval-poly (fn [x]
                    (+ (* a (* x x x))
                       (* b (* x x))
                       (* c x)
                       d))
        xs (range -1.0 1.0 (/ 2 (dec n))) 
        ys (map eval-poly xs)]
    [xs, ys]))
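As a quick sanity check with the true coefficients used below (1, -2, 3, -4): at x = -1 the polynomial evaluates to 1*(-1)^3 + (-2)*(-1)^2 + 3*(-1) + (-4) = -1 - 2 - 3 - 4 = -10, and at x = 1 to 1 - 2 + 3 - 4 = -2, matching the first and last ys in the sample output further down.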

(defn polynomial-regression []
  (let [true-a 1.0
        true-b -2.0
        true-c 3.0
        true-d -4.0
        num-samples 20
        [xs ys] (generate-data num-samples true-a true-b true-c true-d)

        ; The coefficients we want to learn
        a (variable (scalar (rand)))
        b (variable (scalar (rand)))
        c (variable (scalar (rand)))
        d (variable (scalar (rand)))

        ; Convenience function to print out the coefficients
        var-vals (fn [] [(aget (.dataSync a) 0)
                         (aget (.dataSync b) 0)
                         (aget (.dataSync c) 0)
                         (aget (.dataSync d) 0)])

        ; Our prediction function
        predict (fn [x]
                  (.tidy js/tf
                         (fn []
                           (-> (.mul a (.pow x (scalar 3)))
                               (.add (.mul b (.square x)))
                               (.add (.mul c x))
                               (.add d)))))

        ; Our loss function
        loss (fn [preds labels]
               (-> (.sub preds labels)
                   (.square)
                   (.mean)))

        ; Our training loop
        train (fn [xs ys num-iter]
                (let [learning-rate 0.5
                      optimizer (.sgd js/tf.train learning-rate)]
                  
                  (dotimes [iter num-iter]
                    (when (zero? (mod iter 10))
                      (println "Iter: " iter "Loss: " (aget (.dataSync (loss (predict xs) ys)) 0) (var-vals)))
                    (.minimize optimizer #(loss (predict xs) ys)))))
        
        num-iter 100]

    (println "xs " xs)
    (println "ys " ys)
    (println "Variables/Coefficients Before " (var-vals))
    (train (tensor1d xs) (tensor1d ys) num-iter)
    (println "Variables/Coefficients After " (var-vals))))

I added a callback to a button so I could call it at will.

[:button {:on-click (fn [e] (polynomial-regression))} "Push Me"]

This is the output of one of the runs. We know the true values of the coefficients are 1, -2, 3, -4. We can see the xs and ys that are generated, and then every 10 iterations we can see the current error/loss and the current values of the coefficients.

After 100 iterations we can see the coefficients are pretty close to their known true values.

Some interesting experiments would be to generate more sample data, run more iterations, add noise to the y values (a quick sketch of that follows the output below), and develop a single-page app that plots the sampled data and the calculated polynomial ... so many cool projects, so little time.

xs  (-1 -0.8947368421052632 -0.7894736842105263 -0.6842105263157895 -0.5789473684210527 -0.4736842105263158 -0.368421052631579 -0.26315789473684215 -0.1578947368421053 -0.052631578947368474 0.05263157894736836 0.1578947368421052 0.26315789473684204 0.36842105263157887 0.4736842105263157 0.5789473684210525 0.6842105263157894 0.7894736842105262 0.894736842105263 0.9999999999999999)
ys  (-10 -9.001603732322497 -8.107012684064733 -7.309228750546727 -6.601253827088497 -5.97608980901006 -5.426738591631434 -4.946202070272634 -4.527482140253682 -4.163580696894591 -3.847499635515381 -3.5722408514360695 -3.330806239976673 -3.11619769645721 -2.9214171161976967 -2.7394663945181517 -2.563347426738592 -2.386062108179035 -2.2006123341594983 -2.0000000000000004)
Variables/Coefficients Before  [0.851018488407135 0.6453003287315369 0.7353794574737549 0.04515934735536575]
Iter:  0 Loss:  28.009965896606445 [0.851018488407135 0.6453003287315369 0.7353794574737549 0.04515934735536575]
Iter:  10 Loss:  0.030018558725714684 [1.782896637916565 -1.6789995431900024 2.4509975910186768 -4.130495071411133]
Iter:  20 Loss:  0.013966093771159649 [1.6366788148880005 -1.880030632019043 2.5543694496154785 -4.048770904541016]
Iter:  30 Loss:  0.008406300097703934 [1.5174497365951538 -1.9551630020141602 2.637821674346924 -4.018227577209473]
Iter:  40 Loss:  0.005438276566565037 [1.420548439025879 -1.9832427501678467 2.705645799636841 -4.006812572479248]
Iter:  50 Loss:  0.003576194401830435 [1.3417935371398926 -1.9937372207641602 2.7607686519622803 -4.0025458335876465]
Iter:  60 Loss:  0.002359963720664382 [1.2777866125106812 -1.99765944480896 2.8055689334869385 -4.000951766967773]
Iter:  70 Loss:  0.0015585235087201 [1.2257661819458008 -1.9991252422332764 2.841979742050171 -4.0003557205200195]
Iter:  80 Loss:  0.0010294165695086122 [1.1834874153137207 -1.9996728897094727 2.8715717792510986 -4.000133037567139]
Iter:  90 Loss:  0.00067995983408764 [1.1491260528564453 -1.999877691268921 2.8956222534179688 -4.000049591064453]
Variables/Coefficients After  [1.121199607849121 -1.999954342842102 2.9151690006256104 -4.00001859664917]
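As a starting point for the noise experiment mentioned above, here's a minimal sketch that reuses generate-data (noise-scale is a parameter I made up; it jitters each y by a uniform random amount in [-noise-scale, noise-scale)):

(defn generate-noisy-data [n a b c d noise-scale]
  (let [[xs ys] (generate-data n a b c d)]
    [xs (map #(+ % (* noise-scale (- (* 2 (rand)) 1))) ys)]))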

Conclusion

I'm really excited about being able to use TensorFlow in the browser and eventually in a Node app. And it's great to see how easy it is to use from ClojureScript. I look forward to seeing what kinds of libraries people come up with to make it even easier and what people build with it. I also look forward to using the Keras-style deep learning part of TensorFlow.js.

What do you think? Let me know what you want to build.
