Wednesday, February 15, 2017

SAP d-shop’s Virtual House – A journey from Physical to Virtual


This post was originally published as SAP d-shop’s Virtual House – A journey from Physical to Virtual.





Some time ago…our good friends from SAP d-shop Newtown Square (namely John Astill et al.) built an IoT House for SAP Insurance. This little house (handmade, by the way) used an Arduino Nano, a bunch of sensors and LED lights…and…which is pretty cool by the way…a 3D printed washing machine with a water sensor…and of course…it was and it is…IoT enabled.




We thought it was pretty cool…so we have one at our own SAP d-shop in Silicon Valley, and it has become a key part of all our d-shop tours.

Then…some time later, our friends from HCP Marketing (namely Joe Binkley et al.) and Intel built a Smart Building. A really nice building…controlled by Amazon Alexa, using an Intel Galileo, some Arduinos, as well as servos, lights, a solar panel and even a fan…everything, again…IoT enabled…but also, as you may have guessed…voice controlled…so you can send the elevator up and down…open or close the doors and even send the whole building into emergency mode…gladly…we have kept it in the d-shop for quite some time and it’s another of our “wow” factor demos every time someone comes to visit…




Having these two available to us…slowly sparked the fire of innovation and creativity…why don’t we build a Virtual House that can be used on the Oculus Rift and controlled by Alexa?


Not an easy thing…but we sure love challenges…and thanks to our previous experience making Unity3D and Alexa work together, we already knew how to start…



The architecture is pretty simple… The Heroku server is just an echo server, so it will repeat everything we pass to it as a JSON response. Our Unity app is constantly polling the Heroku server to see if there’s a message to respond to. Of course, for this to work as intended, we need to set up a skill on Amazon Alexa just to update the server. So, when we say “open door”, Alexa will send a command to the Heroku server, and the server will then produce an “open door” message in JSON. Our Unity app will read the Heroku server and act accordingly by opening the door…of course, we don’t want this to happen over and over…so after Unity executes the action it sends a null message to the Heroku server, so the next JSON response is going to be null as well and Unity will simply wait for the next valid command.
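
To make that flow a bit more concrete, here is a minimal sketch of the Unity side of the loop. The URL and the two helpers ExtractCommand() and HandleCommand() are made up for illustration, not the actual Virtual House code:

CommandPoller.cs (illustrative sketch)
using System.Collections;
using UnityEngine;

// Minimal sketch of the polling loop described above.
public class CommandPoller : MonoBehaviour {

    // Hypothetical echo server URL -- replace with your own Heroku app.
    const string ServerUrl = "http://your-echo-app.herokuapp.com/";

    void Start () {
        StartCoroutine(PollServer());
    }

    IEnumerator PollServer () {
        while (true) {
            // Ask the echo server for the last command Alexa stored.
            WWW request = new WWW(ServerUrl + "?command");
            yield return request;

            string command = ExtractCommand(request.text);   // e.g. "open door"
            if (!string.IsNullOrEmpty(command)) {
                HandleCommand(command);                       // open the door, move the elevator, etc.

                // Reset the server so the same command is not executed twice.
                yield return new WWW(ServerUrl + "?command=empty");
            }

            yield return new WaitForSeconds(1f);              // don't hammer the server
        }
    }

    string ExtractCommand (string json) { return null; /* parse {"command":"..."} here */ }
    void HandleCommand (string command) { /* act on the command inside the scene */ }
}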

If you want to take a sneak peek at how the Virtual House looks…here are a couple of screenshots…but don’t forget to watch the video :) You will get the full experience -;)





Now…this project started as a “Project in a Box” (for internals only…sorry about that) which means…all the source code and explanations on how to build it from scratch should be provided…but…for obvious reasons…that didn’t happen :( So instead…we turned this into a “Product in a Box” (sorry again…internals only), meaning that you can download the compiled application and simply edit the configuration file to have it running on your own :) No source code is provided, but obviously a nice email can get you that -;)

Grab it from here

Now…that I’ve got your full attention…please watch the video :) It’s a nice journey from the IoT House to the Virtual House, passing by the Smart Building…



Now…you may wonder about the 3D models used for this Virtual House…as you can see in one of the images, most of them were downloaded, but some of them were developed in house :) using Blender…like the Amazon Echo, the 3D printed robot and name tags, and obviously the house itself :) For some other things like the plants and tables…those were imported into Blender and “hand painted”, as the textures were not available.

Now…something that we believe is pretty important…is to list all the pain points and lessons learned while developing this application…

Pain points and lessons learned:

  • As this is a Product in a Box and not a Project in a Box, we’re not going to include the source code for this application, but what we’re going to do instead is let you know the pain points and lessons learned that came from this project.
  • Unity targets a .NET 3.5-era API profile, which has long been superseded by .NET 4.x, so many newer things are not going to work simply because they haven’t been implemented…and why is that? Well…Unity runs on Mono (an open-source, cross-platform implementation of .NET) and I guess they do it to maintain uniformity across all platforms. While Unity’s bundled Mono stays on that 3.5-era profile, the newer APIs simply won’t be there.
  • When loading scenes, the lighting gets all messed up…so you start in level one…move to level two and suddenly it looks like nighttime…the solution is simple…choose “Window → Lighting → Lightmaps”, uncheck the “Auto” checkbox and press “Build” to bake the lighting again.
  • Coroutines are simply awesome. Normally, you can’t make your application wait or sleep…but by using coroutines you certainly can…coroutines behave a bit like cooperative threads, even though they actually run on the main thread.
  • When using a light, make sure it’s turned off while the character is not in the room, because this will save some graphics processing and because, even virtually…we need to be environmentally aware…
  • Unity doesn’t have a wrap function or property for 3D Text…which is kind of problematic, especially if you want to do a Twitter Wall…so your only option is to build your own…although that’s not that hard…simply grab the incoming text, split it by spaces into an array…then concatenate word by word, checking first if the length of the current line is lower than a threshold (the maximum number of characters that fit where our 3D Text is); if adding the word would exceed the threshold, we simply add a line break (“\n”) before doing the concatenation (see the sketch right after this list).
  • As your application grows you might feel the need to duplicate some assets, which is perfectly fine and doesn’t add too much processing (especially if you create a Prefab and use that Prefab), but don’t forget to assign them unique names, otherwise you’re going to have a headache if your application needs to interact with those assets.
  • Sometimes you will download some 3D models from the web…other times you will create them using Blender…but don’t forget that sometimes a simple sphere, cube or any other Unity primitive can work just fine with an image attached to it as its texture.
  • When creating your Alexa skill…make sure not to make any spelling mistakes…otherwise you will hit your head wondering why Alexa isn’t doing what you’re asking her to do…
  • When testing out your application, both Debug.Log() and print() will become your best friends…nothing better than a printed value or message to figure out what’s going wrong.
  • When moving an object, always make sure to record its original position and then add the new value to that recorded position. Otherwise, something might cause the values to drift…by having the original values recorded, you avoid having to recalculate the position: just use that variable and put things back where they belong.
  • When using 3D Text you will notice that even if you put another object in front of it…it will always be visible…which is not very realistic…so we have two options…either create a shader that lets it be occluded…or the easier one…make the material of the object in front of it transparent. That’s not perfect for all situations but at least it will work.
  • The biggest problem when making Unity and Alexa talk to each other is state. When you ask Alexa to turn on the lights, she will respond “The lights are on”…but if you ask a second time her response should be “The lights are already on”…to do this…we would need to use a database or something to store state information…and when closing the application, we would need to clean up the states…while this might be doable…it’s a lot of work, and what happens if the application crashes? Would we need to go and reset the states manually? Not ideal…
  • That leads me to the elevator…you can open the doors or send it to any of the floors…for the most part…that’s easy…each floor is a scene, so you need to be on the first floor in order to send the elevator to floor two or three…but…what if you’re outside the elevator? You are on floor one…ask for floor three…and then you open the door…since your character moves along with the elevator floor…when you open the door everything will look wrong…solution? Simply use a cube without a mesh renderer, so it’s invisible…assign a collider with “Is Trigger” enabled…and check that the player is colliding with the cube before letting the elevator move (see the sketch right after this list)…that way, even if you ask for floor three and Alexa confirms that the elevator is going up…nothing will happen…when you open the door…we can assume that the elevator went down or up to your floor so you can hop in…just an illusion…but it works…
  • Alexa doesn’t have an option to delay the re-prompt, so when exploring the Virtual House she will ask you “What else can I do for you?” and if we don’t respond the skill will just die…so we will need to wake her up again…that’s kind of sad given the nature of the application…but there’s nothing to be done unless Amazon releases a way of making the re-prompt wait longer…
  • As the whole Alexa–Unity3D setup relies on Heroku…expect some downtime, or responses from Alexa that are not actually replicated in the virtual world…it might be an internet connection glitch or just a Heroku glitch…
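
Since we can’t share the project’s source, here is just a minimal sketch of the 3D Text word-wrapping idea from the list above, assuming a TextMesh component and an arbitrary maxCharsPerLine threshold (the class and field names are made up for illustration):

TextMeshWrapper.cs (illustrative sketch)
using UnityEngine;

// Wraps a string by hand before assigning it to a 3D Text (TextMesh) component.
public class TextMeshWrapper : MonoBehaviour {

    public int maxCharsPerLine = 40;   // max characters that fit where the 3D Text sits

    public void SetWrappedText (string incoming) {
        string[] words = incoming.Split(' ');   // split the incoming text by spaces
        string currentLine = "";
        string result = "";

        foreach (string word in words) {
            // If the next word would push the line past the threshold, break the line first.
            if (currentLine.Length + word.Length + 1 > maxCharsPerLine) {
                result += currentLine + "\n";
                currentLine = word;
            } else {
                currentLine = currentLine.Length == 0 ? word : currentLine + " " + word;
            }
        }

        GetComponent<TextMesh>().text = result + currentLine;
    }
}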

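And a similarly hedged sketch of the invisible elevator trigger described above, assuming the player object is tagged “Player” and has the usual collider/rigidbody setup (again, illustrative names only, not the Virtual House code):

ElevatorGate.cs (illustrative sketch)
using UnityEngine;

// Attach to an invisible cube (no MeshRenderer) whose collider has "Is Trigger" enabled.
// The elevator only honors "go to floor X" commands while PlayerInside is true.
public class ElevatorGate : MonoBehaviour {

    public bool PlayerInside { get; private set; }

    void OnTriggerEnter (Collider other) {
        if (other.CompareTag("Player")) PlayerInside = true;
    }

    void OnTriggerExit (Collider other) {
        if (other.CompareTag("Player")) PlayerInside = false;
    }
}
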
As I mentioned before…the environment gets affected by the weather…if it’s sunny…you will see a clear, sunny sky…if it’s rainy you will see a dark and gloomy sky…and this involves using a Skybox…although not your regular Skybox…and what is a Skybox, anyway? Well…simply put…it’s a cube that covers your whole environment and has different images to simulate the surroundings…the problem is that the regular Skybox only lets you assign six sides…which of course is not enough…you need to use a twelve-sided Skybox…then you can assign sunny images and also cloudy images…that way, when checking the weather, you can modify the luminosity, and that will also affect the Skybox, as it will use one set of images or the other, giving that nice effect of reflecting the outside weather…
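
As a rough illustration of that weather switch on the Unity side, here is a minimal sketch that just swaps between two pre-built skybox materials and dims a directional light; the material and field names are assumptions, not the actual project code:

WeatherSky.cs (illustrative sketch)
using UnityEngine;

// Swaps the environment between a sunny and a cloudy look based on the weather check.
public class WeatherSky : MonoBehaviour {

    public Material sunnySkybox;    // skybox material built from the sunny images
    public Material cloudySkybox;   // skybox material built from the cloudy images
    public Light sun;               // directional light driving the overall luminosity

    public void ApplyWeather (bool isSunny) {
        RenderSettings.skybox = isSunny ? sunnySkybox : cloudySkybox;
        sun.intensity = isSunny ? 1.0f : 0.4f;   // darker, gloomier light when it rains
        DynamicGI.UpdateEnvironment();           // refresh ambient lighting from the new skybox
    }
}
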
Greetings,

Blag.
Development Culture.

Tuesday, February 14, 2017

LED is my new Hello World - Prolog time

As promised...here's my LED Numbers app written in Prolog...it took me a long time...a lot of research...a lot of headaches...so I hope you like it...I'm still a complete Prolog newbie...so...no warranties at all -;)

LEDNumbers.pl

number(0,[[' _  '],['| | '],['|_| ']]).
number(1,[['  '],['| '],['| ']]).
number(2,[[' _  '],[' _| '],['|_  ']]).
number(3,[['_  '],['_| '],['_| ']]).
number(4,[['    '],['|_| '],['  | ']]).
number(5,[[' _  '],['|_  '],[' _| ']]).
number(6,[[' _  '],['|_  '],['|_| ']]).
number(7,[['_   '],[' |  '],[' |  ']]).
number(8,[[' _  '],['|_| '],['|_| ']]).
number(9,[[' _  '],['|_| '],[' _| ']]).

digits(0,[]).
digits(X,[H|T]) :- (X/10 > 0 -> H1 is floor(X/10), H is X mod 10, digits(H1,T)), !.

accRev([],A,A).
accRev([H|T],A,R) :- accRev(T,[H|A],R). 

getDigits(L,R) :- digits(L,Y), accRev(Y, [], R).

show_records([]).
show_records([A|B]) :-
  print_records(A), nl,
  show_records(B).  

print_records([]).
print_records([A|B]) :-
  format('~w',A), 
  print_records(B).

merge([L], L).
merge([H1,H2|T], R) :- maplist(append, H1, H2, H),
    merge([H|T], R), !.

listnum([],[]).
listnum([H1|T1],[R|Y]) :- number(H1,R), listnum(T1,Y).

led(X) :- getDigits(X,Y), listnum(Y,Z), merge(Z,R), show_records(R).

Wanna see it in action? Me too -;)


Back to learning -;)

Greetings,

Blag.
Development Culture.

My first post on Prolog

As always...I was looking for my next programming language to learn...and somehow...Prolog got in the way...

I had played with Logic Programming in the past by learning Mercury...but really...when it comes to logic...Prolog wins the pot...

Did you guys know that the first Erlang compiler was written in Prolog? Me neither -:P

For learning...I'm using SWI-Prolog, which seems to be the nicest and most widely used implementation...and I have to admit...it's pretty cool -;)


So...at a glance...Prolog reminds me of Mercury, of course...but also of Forth a little bit...and, weirdly, of Haskell, in the sense that recursion is a key component...

As happens many times when I'm learning a new programming language...I started off with my Fibonacci numbers application...so here it is...

fibonacci.pl
fibo(NUM,A,B,[H|T]) :- (NUM > 1 -> H is A + B, X is NUM - 1, 
                        (A =:= 0 -> fibo(X,H,B,T); fibo(X,H,A,T))).
fibo(_,_,_,[]).

fibonacci(NUM,R) :- fibo(NUM,0,1,X), !, append([0,1], X, R).

A .pl extension? Yep...the same as Perl...but as you can see...it has nothing to do with Perl at all -;)

Anyway...here's the output screen...


My LED Numbers application is gladly ready and will come right after this blog post -;)

Greetings,

Blag.
Development Culture.

Monday, December 5, 2016

LED is my new Hello World - Rust time

As I'm currently learning Rust, I need to publish my LED app again -;)

Please keep in mind that..."I'm learning Rust"...so my code might be buggy, long and not idiomatic...but...it's enough to showcase the language and allow me to learn more -;)

Here's the code...

led_numbers.rs
use std::io;
use std::collections::HashMap;

fn main(){
 let mut leds:HashMap<&str, &str> = HashMap::new();

 leds.insert("0", " _  ,| | ,|_| ");
 leds.insert("1", "  ,| ,| ");
 leds.insert("2", " _  , _| ,|_  ");
 leds.insert("3", "_  ,_| ,_| ");
 leds.insert("4", "    ,|_| ,  | "); 
 leds.insert("5", " _  ,|_  , _| ");
 leds.insert("6", " _  ,|_  ,|_| ");
 leds.insert("7", "_   , |  , |  ");
 leds.insert("8", " _  ,|_| ,|_| ");
 leds.insert("9", " _  ,|_| , _| ");

 println!("Enter a number : ");
 let mut input_text = String::new();
 io::stdin().read_line(&mut input_text)
            .expect("failed to read");

 let split = input_text.split("");
 let vec: Vec<&str> = split.collect();
 let count = &vec.len() - 2;
 
 for i in 0..3{
  for j in 0..count{
   match leds.get(&vec[j]){
    Some(led_line) => { 
     let line = led_line.split(",");
     let vec_line: Vec<&str> = line.collect();
     print!("{}",&vec_line[i]);
     },
    None => println!("")
   }
  }
  print!("");
 }
 println!("");
}

And here's the result...


Hope you like it, and if you can point me to a more Rusty way of doing it...please let me know -:D

Greetings,

Blag.
Development Culture.

My first post on Rust

Again...I'm learning a new programming language...and this time it's Rust's turn.


Rust is very nice and has some really interesting features like ownership and borrowing...and the syntax really reminds me of OCaml...which is really cool as well...

Right now I'm reading the official documentation, which is pretty well done...so of course I did my Fibonacci numbers app...

fibonacci.rs
use std::io;

fn fib(num: i64, a: i64, b:i64) -> String{
 let mut result: String = "".to_string();
 let sum: i64 = a + b;
 let sum_str: &str = &sum.to_string();
 let a_str: &str = &a.to_string();
 let b_str: &str = &b.to_string();
 if a > 0 && num > 1 {
  result = result + sum_str + " " + &fib((num - 1), (a + b), a);
 }else if a == 0{
  result = "".to_string() + a_str + " " + b_str + " " + 
           sum_str + " " + &fib((num - 1), (a + b), b); 
 }
 result
}

fn main(){
 println!("Enter a number : ");
 let mut input_num = String::new();
 io::stdin().read_line(&mut input_num)
            .expect("failed to read");

 let trimmed = input_num.trim();
    match trimmed.parse::<i64>() {
        Ok(i) => { let result: String = fib(i, 0, 1); print!("{}", result);}
        Err(..) => println!("Please enter an integer, not {}", trimmed)
    };
}

The code is a little bit long for my taste...but that might simply be because I haven't learned enough Rust...or because the ownership/borrowing system trades some brevity for safety...which is actually a pretty good thing...

Here's the result...



My LED Numbers app is ready of course...so it's coming right after this post -;)

Greetings,

Blag.
Development Culture.

Thursday, December 1, 2016

Unity3D and Alexa working together

This post was originally published as Unity3D and Alexa working together.


For a long time...I had the idea of making Unity3D and Alexa work together...however...other projects kept me from actually doing it...so...a couple of days ago...a conversation with a friend made me remember that I really wanted to do this...so I did :)

At first...I wasn't exactly sure how to do it...but then slowly the main idea came to me...what if Unity read a webservice that gets updated by Alexa? When the right command is parsed, Unity creates the object and the problem is solved...seems easy? Well...it actually is...



First things first...we need to create a small NodeJS webserver on Heroku...then...we need to install the Heroku Toolbelt...

Now...create a folder called node_alexa and inside create the following files...

package.json
{
  "dependencies": {
    "express": "4.13.3"
  },
  "engines": {
    "node": "0.12.7"
  }
}

Procfile
web: node index.js
index.js
var express = require('express')
    ,app = express()
    ,last_value;

app.set('port', (process.env.PORT || 5000));

app.get('/', function (req, res) {
  if(req.query.command == ""){
 res.send("{ \"command\":\"" + last_value + "\"}");
  }else{
 if(req.query.command == "empty"){
  last_value = "";
  res.send("{}");
 }else{
  res.send("{ \"command\":\"" + req.query.command + "\"}");
  last_value = req.query.command;
 }
  }
})

app.listen(app.get('port'), function () {
  console.log('Node app is running on port', app.get('port'));
})

Once you have that...log into your Heroku Toolbelt and write the following...

Heroku Toolbelt
cd node_alexa
git init .
git add .
git commit -m "Init"
heroku apps:create "yourappname"
git push heroku master
heroku ps:scale web=0
heroku ps:scale web=1

Your webservice is ready to rock :) You should be able to find it by going to "http://yourappname.herokuapp.com/"

Now...this simple NodeJS-powered webservice will serve as a simple echo server...meaning...whatever you send will be returned as a JSON response...of course...if you send "empty" then the response will be an empty JSON...the main idea here is that we keep the last entered value...if you pass a command, it will be returned again when you don't pass any command at all...so by setting it once...we can call it multiple times without disrupting its value...
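
For example (using a made-up app name, and skipping URL encoding for readability), the exchange with the index.js above looks roughly like this:

http://yourappname.herokuapp.com/?command=create blue sphere   -->  { "command":"create blue sphere" }
http://yourappname.herokuapp.com/?command                      -->  { "command":"create blue sphere" }
http://yourappname.herokuapp.com/?command=empty                -->  {}
http://yourappname.herokuapp.com/?command                      -->  { "command":"" }
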
Next in line...will be to create our Unity app...

Create a new app and call it "WebService" or something like that...the project name doesn't matter too much...

In the Hierarchy window, select "Main Camera" and change the "Transform" details like this...


Now, create a new "3D Object" -> "Cube" and name it "Platform" with the following "Transform" details...


After that, we need to create four walls that will go around the platform...so create four "3D Object" -> "Cube" objects and name them "Wall 1", "Wall 2", "Wall 3" and "Wall 4"...





When everything is ready, your workspace should look like this...


Go to the Project tab and create a new folder called "plugins", then create a new C# file called "SimpleJSON" inside it...copy the source code from here into that file...this will allow us to use SimpleJSON to parse the JSON...

Now...create another folder called "Script" and inside create a new C# file called "MetaCoding"...or whatever you like...



MetaCoding.cs
using UnityEngine;
using System.Collections;
using System.Net;
using System.IO;
using SimpleJSON;

public class MetaCoding : MonoBehaviour {

    int counter = 1;

    IEnumerator DownloadWebService()
    {
        while (true) { 
            WWW w = new WWW("http://yourapp.herokuapp.com/?command");
            yield return w;

            print("Waiting for webservice\n");

            yield return new WaitForSeconds(1f);

            print("Received webservice\n");
        
            ExtractCommand(w.text);

            print("Extracted information");

            WWW y = new WWW("http://yourapp.herokuapp.com/?command=empty");
            yield return y;

            print("Cleaned webservice");

            yield return new WaitForSeconds(5);
        }
    }

    void ExtractCommand(string json)
    {
        var jsonstring = JSON.Parse(json);
        string command = jsonstring["command"];
        print(command);
        if (command == null) { return;  }
        string[] commands_array = command.Split(" "[0]);
        if(commands_array.Length < 3)
        {
            return;
        }
        if (commands_array[0] == "create")
        {
            CreateObject(commands_array[1], commands_array[2]);
        }
    }

    void CreateObject(string color, string shape)
    {

        string name = "NewObject_" + counter;
        counter += 1;
        GameObject NewObject = new GameObject(name);

        switch (shape)
        {
            case "cube":
                NewObject = GameObject.CreatePrimitive(PrimitiveType.Cube);
                break;
            case "sphere":
                NewObject = GameObject.CreatePrimitive(PrimitiveType.Sphere);
                break;
            case "cylinder":
                NewObject = GameObject.CreatePrimitive(PrimitiveType.Cylinder);
                break;
            case "capsule":
                NewObject = GameObject.CreatePrimitive(PrimitiveType.Capsule);
                break;
        }
        NewObject.transform.position = new Vector3(0, 5, 0);
        NewObject.AddComponent<Rigidbody>(); // most likely a Rigidbody, so the spawned shape falls onto the platform
        switch (color)
        {
            case "red":
                NewObject.GetComponent<Renderer>().material.color = Color.red;
                break;
            case "yellow":
                NewObject.GetComponent<Renderer>().material.color = Color.yellow;
                break;
            case "green":
                NewObject.GetComponent<Renderer>().material.color = Color.green;
                break;
            case "blue":
                NewObject.GetComponent<Renderer>().material.color = Color.blue;
                break;
            case "black":
                NewObject.GetComponent<Renderer>().material.color = Color.black;
                break;
            case "white":
                NewObject.GetComponent<Renderer>().material.color = Color.white;
                break;
        }
    }

        // Use this for initialization
    void Start () {
        print("Started webservice import...\n");

        StartCoroutine(DownloadWebService());
    }
 
 // Update is called once per frame
 void Update () {
 
 }
}

Once you have the code...simply attach the script to the Main Camera...


The basic concept for this script is pretty simple...We're creating "DownloadWebService" as an IEnumerator method so we can call it as a Coroutine...which allows us to sleep, since we want to leave some time between calls...

This method will poll our Heroku WebService looking for a "create" command...once it has one...it will parse the JSON response and split it in three...so we get..."create", "blue" and "sphere"...this will call CreateObject, which will then create a blue sphere...after we have done that...the coroutine will simply send a new command to our WebService to clean the output...to make this work nicely...we wait 5 seconds after cleaning the webservice before checking whether there's another "create" call...

And this call will be done by our Alexa skill...so basically when saying "create blue sphere" to Alexa...she will send the command to the WebService...update the message, and our Unity app will grab it...do its work...clean up the WebService...and then wait for Alexa to provide the next command...
So...to kind of wrap up...we need to create our Alexa skill...

First, we're going to create a Lambda function...so log in here...

Of course...I have everything already set up...so I'm going to create a dummy function just to show the steps...

Click on "Create Lambda Function" and you will be presented with this...


There are a bunch of them, of course...so type "Color" into the filter box...


Choose "alexa-skills-kit-color-expert"


Leave this as it is and press "Next"


Choose a name and a description...


Choose an existing role if you have one already...otherwise just create a lambda_basic_execution role...then raise the Timeout to 10 seconds and leave everything else as it is...press "Next"...a confirmation window will appear...so just press "Create function"...

You will be presented with a screen where you can upload your source code (which we will be doing later on) and an ARN number...which we need for the next step...


The following part deals with creating the Alexa skill...so please follow along...and log in here...


Choose "Alexa Skills Kit"...and create a new skill...



Choose a name for your skill and, most importantly...choose an "Invocation Name"...which is what you're going to use to tell Alexa to open your application...something like..."Alexa, open Sandbox"...click Next...

On the Interaction Model tab we have two windows...fill in the following under "Intent Schema"...

Intent Schema
{
  "intents": [
    {
      "intent": "GetUnityIntent",
      "slots": [
        {
          "name": "color",
          "type": "LITERAL"         
        },
        {
          "name": "shape",
          "type": "LITERAL"
        }
      ]
    },
    {
      "intent": "HelpIntent",
      "slots": []
    }
  ]
}

These are basically the parameters that we can use when asking Alexa to do something...

And fill in the following under "Sample Utterances"...

Sample Utterances
GetUnityIntent create {red|color} {sphere|shape}
GetUnityIntent create {yellow|color} {sphere|shape}
GetUnityIntent create {green|color} {sphere|shape}
GetUnityIntent create {blue|color} {sphere|shape}
GetUnityIntent create {black|color} {sphere|shape}
GetUnityIntent create {white|color} {sphere|shape}

GetUnityIntent create {red|color} {cube|shape}
GetUnityIntent create {yellow|color} {cube|shape}
GetUnityIntent create {green|color} {cube|shape}
GetUnityIntent create {blue|color} {cube|shape}
GetUnityIntent create {black|color} {cube|shape}
GetUnityIntent create {white|color} {cube|shape}

GetUnityIntent create {red|color} {cylinder|shape}
GetUnityIntent create {yellow|color} {cylinder|shape}
GetUnityIntent create {green|color} {cylinder|shape}
GetUnityIntent create {blue|color} {cylinder|shape}
GetUnityIntent create {black|color} {cylinder|shape}
GetUnityIntent create {white|color} {cylinder|shape}

GetUnityIntent create {red|color} {capsule|shape}
GetUnityIntent create {yellow|color} {capsule|shape}
GetUnityIntent create {green|color} {capsule|shape}
GetUnityIntent create {blue|color} {capsule|shape}
GetUnityIntent create {black|color} {capsule|shape}
GetUnityIntent create {white|color} {capsule|shape}

GetUnityIntent {thank you|color}

These are all the commands that Alexa can understand...and yes...we could have used "Custom Slot Types" to make this shorter...but...I have had problems with them not working very well with more than one slot...simply hit Next...


Here, choose AWS Lambda ARN...and pick either North America or Europe depending on your physical location...then, in the text box...simply copy and paste the ARN that you received from your Lambda function...

This will send you to the "Test" tab...but we don't want to, and actually can't, use that yet...so go back to the "Skill Information" tab and you will find that a new field has appeared...

And that should be the "Application Id"...copy this value and let's move on to the final step...

Create a folder called "Unity" and inside it a folder called "src"...into that folder copy this file "AlexaSkill.js"

We're going to use the "request" module of NodeJS...so install it locally in the Unity folder like this...

sudo npm install --prefix=~/Unity/src request 

This will create a node_modules folder with the request module in it...

Then, create a new file called "index.js"


index.js
var request = require("request")
  , AlexaSkill = require('./AlexaSkill')
    , APP_ID     = 'yourappid';

var error = function (err, response, body) {
    console.log('ERROR [%s]', err);
};

var getJsonFromUnity = function(color, shape, callback){

var command = "create " + color + " " + shape;

if(color == "thank you"){
 callback("thank you");
}
else{
var options = { method: 'GET',
  url: 'http://yourapp.herokuapp.com/',
  qs: { command: command },
  headers: 
   { 'postman-token': '230914f7-c478-4f13-32fd-e6593d8db4d1',
     'cache-control': 'no-cache' } };

var error_log = "";

request(options, function (error, response, body) {
 if (!error) {
  error_log = color + " " + shape;
 }else{
  error_log = "There was a mistake";
 }
  callback(error_log);
    });
}
}

var handleUnityRequest = function(intent, session, response){
  getJsonFromUnity(intent.slots.color.value,intent.slots.shape.value, function(data){
 if(data != "thank you"){
 var text = 'The ' + data + ' has been created';
 var reprompt = 'Which shape would you like?';
    response.ask(text, reprompt);
 }else{
  response.tell("You're welcome");
 }
  });
};

var Unity = function(){
  AlexaSkill.call(this, APP_ID);
};

Unity.prototype = Object.create(AlexaSkill.prototype);
Unity.prototype.constructor = Unity;

Unity.prototype.eventHandlers.onSessionStarted = function(sessionStartedRequest, session){
  console.log("onSessionStarted requestId: " + sessionStartedRequest.requestId
      + ", sessionId: " + session.sessionId);
};

Unity.prototype.eventHandlers.onLaunch = function(launchRequest, session, response){
  // This is when they launch the skill but don't specify what they want.

  var output = 'Welcome to Unity. Create any color shape by saying create and providing a color and a shape';

  var reprompt = 'Which shape would you like?';

  response.ask(output, reprompt);

  console.log("onLaunch requestId: " + launchRequest.requestId
      + ", sessionId: " + session.sessionId);
};

Unity.prototype.intentHandlers = {
  GetUnityIntent: function(intent, session, response){
    handleUnityRequest(intent, session, response);
  },

  HelpIntent: function(intent, session, response){
    var speechOutput = 'Create a new colored shape. Which shape would you like?';
    response.ask(speechOutput);
  }
};

exports.handler = function(event, context) {
    var skill = new Unity();
    skill.execute(event, context);
};

This code is very simple...because it's mostly a template...you simply copy it...change a couple of things and you're ready to go...

Basically, when you say "Alexa, open Unity"...she will listen for your requests...so you can say "create green cube"...this will call our Heroku WebService and then wait for another command...if you don't speak to her again...she will prompt you to say something...if you say "Thank you" she will politely deactivate herself...

And that's pretty much it...once Alexa sends the command to the WebServer...our Unity app will read it and act accordingly...creating whatever shape and color you requested...nice, huh?

But of course...you don't believe me, do you? It can't be that simple...well...yes and no...it's simple...but I took all the pain points and turned them into a nice and clean set of instructions for you...

So...here's how it looks when you run the Unity app...



And here's the action video...


Hope you like it...and stay tuned...because for me this was only a proof of concept...the real thing will become my next full time project...

Greetings,

Blag.
Development Culture.

LED is my new Hello World - Swift (for Linux) time

It took me some time to write this post...mainly because I'm now learning Rust and also because I just finished my latest demo...whose blog post is coming later today -;)

This version of my LED Numbers app becomes the 25th language version...so...obviously...it's a pretty nice milestone for me -:D Who knows? Maybe I will do something nice if I can ever reach 50 languages -:D

Anyway...like I love to say..."Enough talk...show me the source code" -;)

LedNumbers.swift
let leds: [Character:String] = [
 "0" : " _  ,| | ,|_| ",
 "1" : "  ,| ,| ",
 "2" : " _  , _| ,|_  ",
 "3" : "_  ,_| ,_| ",
 "4" : "    ,|_| ,  | ",
 "5" : " _  ,|_  , _| ",
 "6" : " _  ,|_  ,|_| ",
 "7" : "_   , |  , |  ",
 "8" : " _  ,|_| ,|_| ",
 "9" : " _  ,|_| , _| "
];

print("Enter a number: ",terminator:"");
let num = readLine(strippingNewline: true);

var line = [String]();
var led = "";

for i in 0...2{
 for character in num!.characters{
  line = String(leds[character]!)!.
                       characters.split(separator: ",").map(String.init);
  print(line[i], terminator:"");
 }
 print("");
}

And here's the picture of it working its magic -:)


Greetings,

Blag.
Development Culture.

Monday, November 28, 2016

My first post on Swift (for Linux)


As Apple kindly released Swift for Linux...I had to learn about it -:)

Of course...it's not fully implemented...so most of the things that make Swift awesome on iOS are not here yet...but still...it's awesome! -:D

Swift is kind of functional...so you can see a lot of Haskell and Erlang in it...but it's also imperative and object-oriented...so that makes it a really interesting language...

As usual...here's my Fibonacci numbers little app...

fibonacci.swift
func fib(num:Int,a:Int,b:Int) -> String{
 var result: String = "";
 if a > 0 && num > 1{
  result = result + String(a + b) + " " + 
           fib(num: (num - 1), a: (a + b), b: a);
 }else if a == 0{
  result = String(a) + " " + String(b) + " " + 
           String(a + b) + " " + 
           fib(num: (num - 1), a: (a + b), b: b);
 }
 return result;
}

print("Enter a number: ",terminator:"");
let number = Int(readLine(strippingNewline: true)!);

print(fib(num: number!, a: 0, b: 1));

And here's the result....


I already have the LED Numbers app ready...so just wait for it -;)

Greetings,

Blag.
Development Culture.

Tuesday, November 15, 2016

LED is my new Hello World - Perl Time

As promised...here's my LED Numbers app a la Perl...and as always...please keep in mind that I'm a Perl newbie...I know there are more efficient, shorter and more concise ways of writing this app...but...how good is introductory code that uses obscure and arcane tricks? I don't want to scare people away from Perl...I want people to say "Hey...that doesn't look hard...I want to learn Perl"...

So...here it is...

LedNumbers.pl
#!/usr/bin/perl
use strict;
use warnings;
use diagnostics;

my %leds = (
 0 => ' _  ,| | ,|_| ',
 1 => '  ,| ,| ',
 2 => ' _  , _| ,|_  ',
 3 => '_  ,_| ,_| ',
 4 => '    ,|_| ,  | ',
 5 => ' _  ,|_  , _| ',
 6 => ' _  ,|_  ,|_| ',
 7 => '_   , |  , |  ',
 8 => ' _  ,|_| ,|_| ',
 9 => ' _  ,|_| , _| '
);

print "Enter a number: ";
my $num = <>;
my @numbers = ( $num =~ /\d/g );

for my $i (0 .. 2){
 for my $j (0 .. scalar(@numbers) - 1){
  my @line = split /\,/,$leds{$numbers[$j]};
  print $line[$i];
 }
 print "\n";
}

And here's the output...


And just so you know...this is my 24th version of this code...yep...I have written my LED Numbers app in 24 languages so far -;) What's going to be my end point? Who knows...programming is the limit -;)

Greetings,

Blag.
Development Culture.

My first post on Perl



So yes...I started to learn Perl...why? 3 simple reasons...


  1. I love programming.
  2. For me...Perl belongs to the holy trinity of scripting languages along with Ruby and Python (sorry PHP...you don't make the cut)
  3. Because...it's Perl! Come on!

So...I have been reading Beginning Perl...an awesome book by the way...



If you're using any flavor of Linux or Mac...you should have Perl installed already...if you're using Windows...well...you can always download it and install it -:)

So far...I love Perl...it's pretty amazing...and now I can see why people say that both Python and Ruby heavily borrow stuff from Perl...and sure...PHP too...

I don't have much experience yet, of course...but as always...I start with a simple and small program to get me on the right track...so here's my Fibonacci numbers app...

fibonacci.pl
#!/usr/bin/perl
use strict;
use warnings;
use diagnostics;

sub fib {
 my ($num,$a,$b) = @_;
 my $result = '';
 if ($a>0 && $num>1){
  $result = $result . ($a+$b) . " " . fib($num-1,$a+$b,$a)
 }elsif($a == 0){
  $result = $a . " " . $b . " " . ($a+$b) . " " . fib($num-1,$a+$b,$b)
 }
 return $result
}

print "Enter a number: ";
my $num = <>;

print(fib($num,0,1));

And here's the nice output...


By now...I already have my classic "LED Numbers" app ready...but that goes into another post -;)

Greetings,

Blag.
Development Culture.