Live analysis of Google I/O 2015 keynote

Live analysis of Google’s I/O 2015 keynote below, as it happened.


 

  • 20:47:09 – And that’s all, folks! The most exciting parts to me were the Machine Learning, Deep Neural Networks and Natural Language Processing, along with the Cardboard/VR stuff.
    Thank you all for following along, and hopefully we will soon know more details about the upcoming Android M version.

  • 20:28:16 – Time to talk about immersive Virtual Reality; Cardboard somehow managed to start the revolution.
    Phones got a lot bigger, so now they are releasing Cardboard 2.0, which takes only 3 steps to assemble.
    The Cardboard SDK will now also support iOS, so you can slide in your iPhone and enjoy the same experience as on Android.
    “Expeditions” is a new way to have students “virtually” travel to some destination. The teacher controls the experience and all the viewers are synchronized. Pretty nifty! 🙂
    “Jump” enables any creator to capture the world in VR video, and GoPro will be building a Jump rig.
    YouTube will natively support 360° VR video this summer… I will have to get one of the new Cardboard 2.0 viewers to see how good this new VR is…

     

  • 20:16:58 – Time to talk about developers; Develop/Engage/Earn. Tools to build as quickly and reliably as possible with Android Studio, now at version 1.3… oh wow… C/C++ support in the IDE… a canary version is available.
    Cloud Test Lab, to automate the testing of mobile apps in different environments: just upload the app and Google will run it in 20 different environments and give you a report.
    Now you can index your app content into Google Search.
    Cloud Messaging has been expanded to include iOS & Chrome. Users can also subscribe to topic notifications (quick sketch after this entry).
    Universal App Campaigns let you market your app against a target budget.
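    On the topic notifications: a hedged sketch of what subscribing looks like with the Google Play services GCM client as I understand it today. The sender ID placeholder and the “/topics/io2015” topic name are mine, and the exact API may still differ from what was announced.

```java
import android.content.Context;

import com.google.android.gms.gcm.GcmPubSub;
import com.google.android.gms.gcm.GoogleCloudMessaging;
import com.google.android.gms.iid.InstanceID;

import java.io.IOException;

public class TopicSubscriber {
    // Placeholder: the project's sender ID from the Google Developers Console.
    private static final String SENDER_ID = "YOUR_SENDER_ID";

    // Must run off the main thread: both calls do network I/O.
    public static void subscribeToIoUpdates(Context context) throws IOException {
        // Obtain the registration token for this app instance.
        String token = InstanceID.getInstance(context)
                .getToken(SENDER_ID, GoogleCloudMessaging.INSTANCE_ID_SCOPE, null);

        // Subscribe the token to a topic; the server can then fan out one
        // message to every subscriber of "/topics/io2015".
        GcmPubSub.getInstance(context).subscribe(token, "/topics/io2015", null);
    }
}
```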
     
  • 20:05:38 – Maps Offline; finally, it seems we will be able to save maps and also ALL the context/search data, and navigate online or offline… FINALLY!

     

  • 19:48:58 – Google Photos; Home/Organize/Share are the three pillar ideas.
    Again, thanks to Machine Learning, your photos will get auto-organized without you lifting a finger (well, maybe a couple of taps, lol).
    It looks like a nice, supercharged photo/video app.
    Store all your photos and videos at native resolution (16 Mpx tops), unlimited and for free, starting today, by downloading the app for Android, iOS or the web. Nice, thank you!

     

  • 19:36:04 – Google Now on Android M; pretty impressive NLP going on, and they have created a nice “context-aware” system on the phone that seems to integrate nicely into the normal workflow and enhances the user experience… WELL DONE!

     

  • 19:32:02 – Context/Answers/Actions: this is how Google and their Machine Learning magic work in the background to present useful information. Sounds pretty nice and sure got me excited with all the progress being made. Once you have reliable Context + Answers, you can create some amazing Actions!
    In the end, it’s not so much the hardware as the software that makes everything work.

     

  • 19:28:07 – Time to talk about machine learning, which enables Google to do all the nifty things they are currently doing. They are using Deep Neural Nets (this is quite bleeding edge and what I am trying to study)… 30 layers deep… impressive… So they are at 92% accuracy.
    They are talking about app/data context, which I think is very important and the way to go, as I want my phone to help me in my day-to-day life while staying in the background, instead of wasting my time.

     

  • 19:22:00 – Project Brillo; the OS for the “Internet of Things” is derived from Android, but I imagine a very light, watered-down shell… Some IoT hardware is not as powerful, so you need a very lightweight kernel/OS.
    “Weave” is a new take on simplifying the communication protocol between the different devices. Now you can talk to your oven, lol xD

     

  • 19:15:17 – Talking now about Android Wear; they take a shot across the bow at the iWatch camp: “you don’t need your phone”, lol 🙂
    Very nice that the latest release of Android Wear includes some nice add-ons, but what I care most about is performance, user experience and battery life. The gestures option is handy when your hands are tied up, say you have a latte in your right hand and need to check something on your watch.
    Some stuff looks nice, but we will see how it behaves in real life.
    I just heard some golf lovers jumping with excitement, lol! :-D
    My take on what I’m seeing is that they have tightened the integration between Android Wear, the phone and apps. In the end, it’s all about “choice”.
     
  • 19:07:02 – “Direct Share”, “USB-C”; very logical, but they do resemble the iOS style & logic in the video I’m seeing right now… hmmm… (a quick Direct Share sketch follows below).
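    From what was shown, Direct Share seems to let an app surface specific targets (say, a frequent contact) directly in the share sheet. A minimal sketch of how I imagine the new ChooserTargetService hook being used, assuming the M preview API lands as shown; the class name, contact payload and score are placeholders of mine, not anything Google demoed.

```java
import android.content.ComponentName;
import android.content.IntentFilter;
import android.graphics.drawable.Icon;
import android.os.Bundle;
import android.service.chooser.ChooserTarget;
import android.service.chooser.ChooserTargetService;

import java.util.ArrayList;
import java.util.List;

// Hypothetical service; it would be declared in the manifest with the
// BIND_CHOOSER_TARGET_SERVICE permission and referenced from the app's
// share activity via the chooser_target_service meta-data entry.
public class ContactChooserTargetService extends ChooserTargetService {

    @Override
    public List<ChooserTarget> onGetChooserTargets(ComponentName targetActivityName,
                                                   IntentFilter matchedFilter) {
        List<ChooserTarget> targets = new ArrayList<>();

        Bundle extras = new Bundle();
        extras.putString("contact_id", "42"); // placeholder payload for the share activity

        // Each target shows up as its own entry in the system share sheet.
        targets.add(new ChooserTarget(
                "Jane Doe",                                                   // label shown to the user
                Icon.createWithResource(this, android.R.drawable.sym_def_app_icon), // placeholder icon
                1.0f,                                                         // relative ranking score
                targetActivityName,                                           // activity that handles the share
                extras));
        return targets;
    }
}
```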

     

  • 19:05:17 – Aha, time for some performance enhancements! A new system called “Doze” controls background activity based on the motion sensors, so if you are sleeping and the phone is lying on the table, the phone will “Doze”. I imagine it is a kind of aggressive deferral/grouping of background tasks so they fire a lot less… Finally some control and order in the chaos of background tasks… ABOUT TIME, GUYS!!! (Sketch of the alarm-side escape hatch below.)
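    If Doze really does defer background work while the device sits still, apps with genuinely time-critical work will presumably need an escape hatch. A hedged sketch, assuming the new AlarmManager.setAndAllowWhileIdle() from the M preview is that hatch; the hourly interval and the placeholder receiver are my own illustration.

```java
import android.app.AlarmManager;
import android.app.PendingIntent;
import android.content.BroadcastReceiver;
import android.content.Context;
import android.content.Intent;

public class DozeFriendlyScheduler {

    // Schedule a roughly hourly check that is still allowed to fire while dozing.
    // The system may batch/throttle these, so don't expect exact timing.
    public static void scheduleSync(Context context) {
        AlarmManager alarmManager =
                (AlarmManager) context.getSystemService(Context.ALARM_SERVICE);

        Intent intent = new Intent(context, SyncReceiver.class);
        PendingIntent pending = PendingIntent.getBroadcast(context, 0, intent, 0);

        long oneHourFromNow = System.currentTimeMillis() + AlarmManager.INTERVAL_HOUR;

        // New in the M preview: fires even if the device has entered Doze,
        // unlike a plain set()/setExact() alarm.
        alarmManager.setAndAllowWhileIdle(AlarmManager.RTC_WAKEUP, oneHourFromNow, pending);
    }

    // Placeholder receiver of mine; a real app would kick off its deferred work here.
    public static class SyncReceiver extends BroadcastReceiver {
        @Override
        public void onReceive(Context context, Intent intent) {
            // run the sync
        }
    }
}
```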

     

  • 19:02:57 – A Fingerprint API baked into the OS and also available to all authentication providers and apps… well, nothing new under the sun here… (sketch below).
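    A hedged sketch of how I expect the new fingerprint API to be called, based on the M preview as I understand it: the callback here only logs, a real app would pass a CryptoObject tied to its own keys, and USE_FINGERPRINT would have to be declared in the manifest.

```java
import android.content.Context;
import android.hardware.fingerprint.FingerprintManager;
import android.os.CancellationSignal;
import android.util.Log;

public class FingerprintHelper {
    private static final String TAG = "FingerprintHelper";

    // Starts listening for a fingerprint touch; requires the USE_FINGERPRINT permission.
    public static CancellationSignal startListening(Context context) {
        FingerprintManager fingerprintManager =
                (FingerprintManager) context.getSystemService(Context.FINGERPRINT_SERVICE);

        if (fingerprintManager == null
                || !fingerprintManager.isHardwareDetected()
                || !fingerprintManager.hasEnrolledFingerprints()) {
            Log.w(TAG, "No usable fingerprint hardware or no enrolled fingerprints");
            return null;
        }

        CancellationSignal cancellationSignal = new CancellationSignal();
        fingerprintManager.authenticate(
                null,                 // CryptoObject: tie to a real key in production
                cancellationSignal,   // lets the app stop listening (e.g. in onPause)
                0,                    // flags, currently unused
                new FingerprintManager.AuthenticationCallback() {
                    @Override
                    public void onAuthenticationSucceeded(
                            FingerprintManager.AuthenticationResult result) {
                        Log.d(TAG, "Fingerprint recognized");
                    }

                    @Override
                    public void onAuthenticationFailed() {
                        Log.d(TAG, "Fingerprint not recognized, try again");
                    }

                    @Override
                    public void onAuthenticationError(int errorCode, CharSequence errString) {
                        Log.w(TAG, "Fingerprint error: " + errString);
                    }
                },
                null);                // optional Handler for the callbacks
        return cancellationSignal;
    }
}
```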

     

  • 19:01:25 – Now it’s time to promote “Android Pay”; apart from the usual, it seems it will also be baked into apps, so you will be able to pay inside specific apps instead of having to enter your card details.
    And of course, it will include support for the fingerprint scanner.

     

  • 18:59:26 – Onto “App Links”; an enhanced intent platform, managed by each app, that will be able to seamlessly open authenticated links in the corresponding apps, so a Twitter link in an email would open in the Twitter app, already authenticated. Seems nice, but I will have to check it out in depth… (sketch below).
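    As far as I can tell, the verified association is declared in the manifest (an http/https intent filter marked android:autoVerify="true", plus a statement file hosted on the website), and the receiving activity then simply reads the link it was launched with. A minimal hedged sketch of that receiving side; LinkHandlerActivity and the ID parsing are made-up illustrations of mine, not Twitter’s actual implementation.

```java
import android.app.Activity;
import android.content.Intent;
import android.net.Uri;
import android.os.Bundle;

// Hypothetical activity registered in the manifest with an http/https intent
// filter marked android:autoVerify="true" so verified links skip the chooser.
public class LinkHandlerActivity extends Activity {

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // When a verified web link is tapped elsewhere (e.g. in an email),
        // the system launches this activity directly with the URL attached.
        Intent intent = getIntent();
        Uri data = intent.getData();

        if (Intent.ACTION_VIEW.equals(intent.getAction()) && data != null) {
            // e.g. https://example.com/status/12345 -> "12345"
            String itemId = data.getLastPathSegment();
            showItem(itemId);
        } else {
            finish();
        }
    }

    private void showItem(String itemId) {
        // Placeholder: render the deep-linked content using whatever
        // authenticated session the app already holds.
    }
}
```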

     

  • 18:56:43 – Next is the “Web Experience”; Chrome Custom Tabs. It seems like Chrome gets baked into every app that requests it, so you don’t actually abandon the app, and all the benefits of Chrome carry over to the apps that use it. Rolling out in Q3… (sketch below).
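    Since this only rolls out in Q3, the client API may still change, but here is a hedged sketch of how I expect a custom tab to be launched from the support library; the toolbar color and URL are placeholders of mine.

```java
import android.app.Activity;
import android.graphics.Color;
import android.net.Uri;
import android.support.customtabs.CustomTabsIntent;

public class CustomTabLauncher {

    // Opens the given URL in a Chrome Custom Tab so the user never fully
    // leaves the app, while still getting Chrome's cookies, autofill,
    // saved passwords, etc.
    public static void open(Activity activity, String url) {
        CustomTabsIntent customTabsIntent = new CustomTabsIntent.Builder()
                .setToolbarColor(Color.parseColor("#3F51B5")) // placeholder brand color
                .setShowTitle(true)                           // show the page title in the toolbar
                .build();

        customTabsIntent.launchUrl(activity, Uri.parse(url));
    }
}
```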

     

  • 18:53:57 – First thing on the table is “App Permissions”; it’s about time to have this basic functionality baked into the system instead of relying on “hacks” or “add-ons” that require rooting. (Sketch of the new flow below.)
    The thing is that what I am seeing looks awfully like the iOS implementation…?
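    A hedged sketch of what the new ask-at-time-of-use flow looks like based on the M preview: check whether the permission is already granted, request it if not, then handle the user’s answer in a callback. The request code and the camera example are my own choices.

```java
import android.Manifest;
import android.app.Activity;
import android.content.pm.PackageManager;
import android.widget.Toast;

public class CameraPermissionActivity extends Activity {

    private static final int REQUEST_CAMERA = 1; // arbitrary request code of mine

    // Call right before the feature that needs the permission, not at install time.
    private void openCameraWithPermissionCheck() {
        if (checkSelfPermission(Manifest.permission.CAMERA)
                == PackageManager.PERMISSION_GRANTED) {
            openCamera();
        } else {
            // Shows the system dialog; the user can now deny just this one permission.
            requestPermissions(new String[]{Manifest.permission.CAMERA}, REQUEST_CAMERA);
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode,
                                           String[] permissions,
                                           int[] grantResults) {
        if (requestCode == REQUEST_CAMERA
                && grantResults.length > 0
                && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            openCamera();
        } else {
            Toast.makeText(this, "Camera permission denied", Toast.LENGTH_SHORT).show();
        }
    }

    private void openCamera() {
        // Placeholder: launch the camera feature here.
    }
}
```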

     

  • 18:52:02 – Time to talk about Android M; it seems this year everyone is focusing on quality and performance… I say it’s about time to take care of the “Core User Experience”.

     

  • 18:49:43 – So HBO NOW is coming to Android.

     

  • 18:42:46 – Sundar Pichai takes the stage after an interesting intro. More than 2 million people are watching the live stream, and I’m one of them, lol :p

     

  • 18:37:36 – An immersive galaxy with the many planets of our solar system… I wonder what the message is there… and we see our beloved blue planet!
     
  • 18:32:08 – The stage reminds me a bit of Samsung’s S6 unveiling: lots of screens, hahahaha, playing ping pong :-D…

     

  • 18:30:35 – AND WE ARE LIVE!!!!!

PS: I dig the “I/O”-themed graphical countdown clock they set up.

 
