Deep Neural Networks With OpenCV and Clojure on AWS Lambda



Learn more about Deep Neural Networks with OpenCV and Clojure

In our previous post, we managed to run a Yolo-based Deep Neural Network directly on a Raspberry Pi, with object detection in semi-real-time on pictures and video streams. The processing was done locally, which is close to optimal for a local video stream, but it can be a little too power-hungry if you have a farm of these devices.

Here are some not-so-easy-to-get power consumption values for the Raspberry Pi. You can easily see that heavy CPU usage doubles energy consumption. In that case, a possible solution is to offload processing from the Raspberry Pi onto servers, using easy-to-set-up lambdas.

You may also like: Raspberry Pi, OpenCV, Deep Neural Networks, and — Of Course— a Bit of Clojure

You’ve probably tried AWS Lambda before, and some of you might even be using it in production. Lambdas are stateless functions, each exposing a handler, deployed and hosted remotely on AWS infrastructure.

In this post, we are going to create and deploy (obviously) Clojure-based lambdas, and call them from our beloved Raspberry Pi to analyze some pictures using Neural Networks via Origami.

The lambda will be developed and deployed from a local development machine, and will finally be called from the Raspberry Pi, as shown in the diagram below:

To interact with and set up AWS services, one usually goes through the AWS command-line interface.

To install awscli, you can use apt:

sudo apt install awscli

But depending on your luck and timing, you may end up with a slightly old version and may want to try the Python-based install instead:

sudo pip3 install awscli --upgrade

Once you have the AWS command ready, make sure to configure it with one of your IAM users.

Configure the AWS IAM User

In the AWS console, we simply set up a new IAM user; here named nico.

We also create a role, lando, to which we give full lambda access:

Then, retrieve the key ID and the Access Key of that new user. We can now run the AWS configuration from the Raspberry Pi and our development machine.

This is done with the aws configure command, where you have to answer the usual questions on:

  • AWS Access Key ID
  • AWS Secret Access Key
  • Default region name
  • Default output format
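After answering, the CLI persists these values in two small files under ~/.aws. As a sketch (the key values below are placeholders to fill in with your own credentials; the region here matches the one used throughout this post):

```ini
# ~/.aws/credentials
[default]
aws_access_key_id = <your-access-key-id>
aws_secret_access_key = <your-secret-access-key>

# ~/.aws/config
[default]
region = ap-northeast-1
output = json
```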

That’s all in the AWS console, for now.

First Lambda in Clojure

This is a really brief summary of the existing documentation on the AWS website about writing and deploying a lambda in Clojure.

Here, we will create a new Clojure project using lein, write a function that can be called by the lambda framework, and deploy it. We’ll call our little project lando.

“How you doin’, ya old pirate? … 

You’re used to it, so let’s make creating a new Clojure project brief:

lein new lando

Which gives us the project structure shown below:

├── project.clj
├── resources
├── src
│   └── lando
│       └── core.clj

The project.clj file doesn’t need anything special, and so in short:

(defproject lando "0.1.0-SNAPSHOT"
  ...
  :dependencies [[org.clojure/clojure "1.10.0"]]
  :main lando.core
  :profiles {:uberjar {:aot :all}}
  :repl-options {:init-ns lando.core})

Note that there is no extra AWS-specific library needed … for now. Also, note that we are forcing ahead-of-time (AOT) compilation so that the Java classes are generated at compile time.

Now, on to the core.clj file itself. Again, this is pretty much a copy of the AWS documentation, so nothing new here. We use gen-class to create the Java class that will be called by the lambda framework.

Here, the function will take a String and return a String, both are Java objects. The core of the function is self-explanatory; it returns a string of “Hello ” and the content of the string passed in the parameter.

(ns lando.core
  (:gen-class
   :methods [^:static [handler [String] String]]))

(defn -handler [s]
  (str "Hello " s "!"))

We’re done. Let’s create a jar file out of this that can be deployed onto AWS Lambda.

lein uberjar

We now have a nice jar file in the target folder:

ls -al target/lando-0.1.0-SNAPSHOT-standalone.jar

To deploy our function, we use the AWS CLI and the create-function subcommand:

aws lambda create-function --function-name core \
 --handler lando.core::handler \
 --runtime java8 \
 --memory 512 \
 --timeout 20 \
 --zip-file fileb://./target/lando-0.1.0-SNAPSHOT-standalone.jar \
 --role arn:aws:iam::817572757560:role/lando

To check that all went fine, you can run:

aws lambda list-functions

And check that the output contains the newly deployed function:

{
  "Functions": [
    {
      "FunctionName": "core",
      "FunctionArn": "arn:aws:lambda:ap-northeast-1:817572757560:function:core",
      "Runtime": "java8",
      "Role": "arn:aws:iam::817572757560:role/lando",
      "Handler": "lando.core::handler",
      "CodeSize": 22140085,
      "Description": "",
      "Timeout": 20,
      "MemorySize": 512,
      "LastModified": "2019-09-20T07:56:11.644+0000",
      "CodeSha256": "KvnPmrSwEjWcUvDbhjy2dE2+VxhmjnAHqa2ghzhatMg=",
      "Version": "$LATEST",
      "TracingConfig": { "Mode": "PassThrough" },
      "RevisionId": "d699f624-6c6a-4e24-a8aa-15f4ad8da5cc"
    }
  ]
}

Usually, if something goes bad at this stage, it’s because of a lack of AWS permission, or a typo in the naming of the handler function.

When using multiple profiles, make sure the one in use is the one you want by setting the AWS_PROFILE  environment variable.

export AWS_PROFILE="default"

The most complicated part is done; now, we can just call the lambda using the invoke subcommand.

Recalling our deployment graph from earlier on, we can now switch from the development machine to our Raspberry Pi. Calling the function is done via invoke.

aws lambda invoke --function-name core --payload '"Lando"' lando.log

And the status of the call is shown on completion:

{
  "StatusCode": 200,
  "ExecutedVersion": "$LATEST"
}

The result of the call itself is in the lando.log file:

$ cat lando.log
"Hello Lando!"


Spicing Things Up by Calling OpenCV/Origami

Let’s see how it goes by using our favorite Clojure imaging library, origami.

Our example will be quite easy: we just output the OpenCV version in use, which also confirms that all the required native libraries can be loaded properly from within the lambda context.

In the dependencies section of project.clj, let’s add the new dependency:

:dependencies [[org.clojure/clojure "1.10.0"]
               [origami "4.1.1-6"]]

And let’s create a new namespace, origami, within the same lando project, with the following code:

(ns lando.origami
  (:require [opencv4.core :refer :all])
  (:gen-class
   :methods [^:static [handler [String] String]]))

(defn -handler [s]
  (str "Using OpenCV Version: " VERSION ".."))

Nothing surprising in the code yet. We can run lein uberjar to create the jar file, and then run create-function from the CLI using the same role.

aws lambda create-function --function-name origami \
 --handler lando.origami::handler \
 --runtime java8 \
 --memory 512 \
 --timeout 20 \
 --zip-file fileb://./target/lando-0.1.0-SNAPSHOT-standalone.jar \
 --role arn:aws:iam::817572757560:role/lando

This time, things did not go too well at first glance… and we’re stuck with:

An error occurred (RequestEntityTooLargeException) when calling the CreateFunction operation: Request must be smaller than 69905067 bytes for the CreateFunction operation
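That magic number in the error message is a byte count; converting it to more familiar units shows what we are actually up against (a quick sanity check, not an official quota statement):

```shell
# Convert the CreateFunction request limit from the error message
# (69905067 bytes) to whole mebibytes using integer division.
echo $(( 69905067 / 1024 / 1024 ))  # prints 66
```

So the deployment request must stay under roughly 66 MiB, and our jar is well past that.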

Let’s check the size of the generated jar file:

$ ls -alh target/lando-0.1.0-SNAPSHOT-standalone.jar
-rw-r--r--  1 niko  staff    95M Sep 20 17:05 target/lando-0.1.0-SNAPSHOT-standalone.jar

That would be a little bit over the roughly 70 MB limit enforced by AWS. Something you may not know is that origami embeds a version of OpenCV for each platform it is supposed to run on. This way, everyone can get started very quickly, but it makes the resulting standalone jar file rather large.

$ unzip -l target/lando-0.1.0-SNAPSHOT-standalone.jar | grep natives
        0  12-17-2018 10:27   natives/linux_32/
        0  08-02-2019 11:18   natives/linux_64/
 39529136  08-02-2019 11:18   natives/linux_64/
        0  08-14-2019 10:11   natives/linux_arm/
 25907540  08-14-2019 10:12   natives/linux_arm/
        0  08-02-2019 13:54   natives/linux_arm64/
 32350952  08-02-2019 13:55   natives/linux_arm64/
        0  08-01-2019 14:47   natives/osx_64/
 82392636  08-01-2019 14:47   natives/osx_64/libopencv_java411.dylib
        0  12-17-2018 10:27   natives/windows_32/
        0  08-01-2019 16:36   natives/windows_64/
 55030784  08-01-2019 16:37   natives/windows_64/opencv_java411.dll

Since lambdas run on Linux, we can focus on keeping only the compiled library we need, i.e., the one in natives/linux_64/.

Let’s slim things a bit:

zip -d target/lando-0.1.0-SNAPSHOT-standalone.jar \
 natives/windows_64/opencv_java411.dll \
 natives/osx_64/libopencv_java411.dylib \
 "natives/linux_arm64/*" \
 "natives/linux_arm/*"

And check the size of the standalone jar again:

$ ls -alh target/lando-0.1.0-SNAPSHOT-standalone.jar
-rw-r--r--  1 niko  staff    21M Sep 20 17:14 target/lando-0.1.0-SNAPSHOT-standalone.jar

Way better. Now, we can redeploy the origami function again, and call it:

aws lambda invoke --function-name origami --payload '""' origami.log

$ cat origami.log
"Using OpenCV Version: 4.1.1.."

Running Our Deep Neural Network on Lambda

You’re probably used to it by now if you’ve been following this series of posts on running a DNN with Clojure: we’ll pretty much just wrap the previously written code and call it from our lambda handler. This time, the function creates the network and runs it on the lambda’s input, a URL of the picture whose content we want to identify. We’ll put this code in a new namespace named lando.dnn.

(ns lando.dnn
  (:require [ :as yolo]
            [opencv4.core :refer :all]
            [opencv4.utils :as u]
            [origami-dnn.core :as origami-dnn])
  (:gen-class
   :methods [^:static [handler [String] String]]))

(defn result! [result labels]
  (let [img (first result)
        detected (second result)]
    (map #(let [{confidence :confidence label :label box :box} %]
            {:label (nth labels label) :confidence confidence})
         detected)))

(defn run-yolo [input]
  (let [[net opts labels] (origami-dnn/read-net-from-repo "networks.yolo:yolov2-tiny:1.0.0")]
    (println "Running yolo on image:" input)
    (-> input
        (u/mat-from-url)
        (yolo/find-objects net)
        (result! labels))))

(defn -handler [s]
  (apply str (run-yolo s)))

In the code snippet above, note that in this example we are only interested in the detected objects: their labels and the confidence associated with each of them. The result! function is there to format the result returned by find-objects, and replaces the usual blue-boxes step that draws directly on the picture.

The project now requires a new dependency on origami-dnn, which can simply be added with:

:dependencies [[org.clojure/clojure "1.10.0"]
               [origami-dnn "0.1.5"]]

This time, when deploying, we actually do need to specify a Java system property that is used by origami-dnn to expand the network files to somewhere more convenient, here /tmp, which is kept between invocations of the same lambda, thus avoiding retrieving the files over and over again.

This is done by setting the JAVA_TOOL_OPTIONS variable via the --environment switch.

aws lambda create-function --function-name dnn \
 --handler lando.dnn::handler \
 --runtime java8 \
 --memory 512 \
 --timeout 20 \
 --zip-file fileb://./target/lando-0.1.0-SNAPSHOT-standalone.jar \
 --role arn:aws:iam::817572757560:role/lando \
 --environment Variables="{JAVA_TOOL_OPTIONS=-Dnetworks.local=/tmp}"

The function needs a picture, so obviously we’ll find a picture of a cat …

And feed it to the function call.

aws lambda invoke \
 --function-name dnn \
 --payload '""' \
 dnn.log

And we’ll be happy to know that this is a cat:

$ cat dnn.log
"{:label \"cat\", :confidence 0.7528027}"

Great. You can try this with a few different pictures. You may also notice that even though the first run was quite slow, as it needed to fetch and expand the compiled network files, subsequent runs can reuse those files, so the network then loads almost instantly.

Running the DNN on a List of Images & Return JSON

“You might want to buckle up, baby.”

The last example builds on the previous image-detection lambda but takes an array of images as input. It then returns a well-formatted JSON answer containing the detection results for each of the inputs.

Clojure’s Cheshire will be added to project.clj and used to generate the resulting JSON.

 [cheshire "5.9.0"]

And we’ll create a lando.dnn2 namespace, which starts as a copy-paste of lando.dnn and brings a few updates.

First, the generated class has a slightly different method signature; we tell the framework that the input will be a list, not a string.

(:gen-class :methods [^:static [handler [java.util.List] String]])

Second, we move the preparation of the network outside the declaration of the sub-functions, to make things slightly faster.

(let [[net opts labels] (origami-dnn/read-net-from-repo "networks.yolo:yolov2-tiny:1.0.0")]
  (defn result! [result labels] ...)
  (defn run-yolo [input] ...))

Finally, we use Cheshire’s  generate-string to return properly formatted JSON.

(generate-string (doall (map run-yolo s)))

Putting it all together, the full Clojure namespace gives us:

(ns lando.dnn2
  (:require [ :as yolo]
            [opencv4.core :refer :all]
            [opencv4.utils :as u]
            [origami-dnn.core :as origami-dnn]
            [cheshire.core :refer [generate-string]])
  (:gen-class
   :methods [^:static [handler [java.util.List] String]]))

(let [[net opts labels] (origami-dnn/read-net-from-repo "networks.yolo:yolov2-tiny:1.0.0")]
  (defn result! [result labels]
    (let [img (first result)
          detected (second result)]
      (doall
       (map #(let [{confidence :confidence label :label box :box} %]
               {:label (nth labels label) :confidence confidence})
            detected))))
  (defn run-yolo [input]
    (println "Running yolo on image:" input)
    (-> input
        (u/mat-from-url)
        (yolo/find-objects net)
        (result! labels))))

(defn -handler [s]
  (println (first s))
  (generate-string (doall (map run-yolo s))))

Deployment is exactly the same as before, except for the updated names: the function is now dnn2, and the handler is lando.dnn2::handler.

The payload to send to the function is also super long, but basically, it is an array of the different image URLs that we want to run object detection on.

aws lambda invoke \
 --function-name dnn2 \
 --payload '["","","",""]' \
 dnn2.log

The first call is quite slow due to the fact that the lambda needs to retrieve and expand the network files:

$ cat dnn2.log
"[[{\"label\":\"dog\",\"confidence\":0.7746848}],[{\"label\":\"cat\",\"confidence\":0.6056155}],[{\"label\":\"cat\",\"confidence\":0.6056155}],[{\"label\":\"cat\",\"confidence\":0.78931814}]]"

And in the end, it’s all about cats and dogs…

Hope you enjoyed! Let us know your thoughts in the comments section!

Further Reading

How to Deploy OpenCV on Raspberry Pi and Enable Machine Vision

Raspberry Pi, OpenCV, Deep Neural Networks, and — Of Course — a Bit of Clojure

OpenCV Scripting With Clojure on Raspberry Pi
