August Kaggle Dataset Publishing Awards Winners’ Interview


In August, over 350 new datasets were published on Kaggle, in part sparked by our $10,000 Datasets Publishing Award. This interview delves into the stories and backgrounds of August’s three winners: Ugo Cupcic, Sudalai Rajkumar, and Colin Morris. They answer questions about what stirred them to create their winning datasets and the kernel ideas they’d love to see other Kagglers explore.

You can find my interview here.

TV interview

I had the chance to be interviewed about my work at Shadow Robot by a French TV channel. This was made possible by working at a coworking space, yet another advantage of being a remotie!

Head over here to see the interview (you can skip to 13’30).

August Kaggle Dataset Publishing Awards Winners!

First Place, Grasping Dataset by Ugo Cupcic

Our team selected this dataset because it combines two exciting fields of research: robotics and deep learning. To learn more about the dataset, read Ugo’s excellent blog post, “How I taught my robot to realize how bad it was at holding things”, which complements his use of Kaggle to make such a unique, research-based dataset open and accessible to all.

You can find the announcement over here.

We built an open sandbox for training robotic hands to grasp things


Getting started with robotics is probably a lot easier than you think. Here’s a simulation sandbox that’s cross-platform and provides a simple high-level API. It should help you start experimenting with robot grasping tasks.

As the Chief Technical Architect at the Shadow Robot Company, I spend a lot of time playing with different algorithms to see how they’d fit our robots. Controlling a complex robot to make it behave the way you’d want in a complex environment is… complex!
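The post itself walks through the sandbox in detail. Purely as a rough illustration of what a “simple high-level API” for grasping can look like, here is a minimal, self-contained Python sketch; the class and method names (GraspSandbox, move_hand_above, and so on) are hypothetical stand-ins, not the sandbox’s actual API.

```python
# Hypothetical sketch: the shape a high-level grasping API could take.
# Names (GraspSandbox, Pose, ...) are illustrative, not the sandbox's real classes.
from dataclasses import dataclass


@dataclass
class Pose:
    """Cartesian position of an object or end effector (metres)."""
    x: float
    y: float
    z: float


class GraspSandbox:
    """Toy stand-in for a simulation sandbox with a simple high-level API."""

    def __init__(self):
        self._object_pose = Pose(0.5, 0.0, 0.05)   # object resting on the table
        self._holding = False

    def reset_world(self) -> None:
        """Put the object back on the table and open the hand."""
        self._object_pose = Pose(0.5, 0.0, 0.05)
        self._holding = False

    def get_object_pose(self) -> Pose:
        return self._object_pose

    def move_hand_above(self, pose: Pose, height: float = 0.15) -> None:
        """Pre-grasp: hover the hand a few centimetres above the object."""
        print(f"moving hand to ({pose.x:.2f}, {pose.y:.2f}, {pose.z + height:.2f})")

    def close_fingers(self) -> None:
        self._holding = True

    def lift(self, dz: float = 0.2) -> bool:
        """Lift the hand; the grasp succeeded if the object came with it."""
        if self._holding:
            self._object_pose.z += dz
        return self._object_pose.z > 0.1


if __name__ == "__main__":
    sandbox = GraspSandbox()
    sandbox.reset_world()
    target = sandbox.get_object_pose()
    sandbox.move_hand_above(target)
    sandbox.close_fingers()
    print("grasp succeeded:", sandbox.lift())
```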

We built an open sandbox for training robotic hands to grasp things

What it’s like to be a Robot in 2017


What will a state-of-the-art robot be able to do in 2017?

There are many different types of robots out there, from humanoid robots to industrial arms that can move with amazing accuracy and speed.

Given my area of expertise, I’ll focus more on grasping and using objects. These are core human skills that robots need to acquire if you want them to be truly useful.

But why’s it so hard for a robot to replicate these skills?

What it’s like to be a Robot in 2017 – freeCodeCamp

Two easy steps to a powerful robot arm interface


A robot hand without a robot arm is, most of the time, useless. At Shadow we have a long history of interfacing different robot arms with our software and hardware. In the different projects we’ve run over the years, we’ve written software for arms from Universal Robots, Denso, Kuka, Staubli… We’ve also developed a few intriguing arms internally, from an arm actuated by air muscles to a lightweight arm that picks up strawberries.

On this journey, we’ve learned a few things. Let me share a few tips on how to quickly write a good interface for a robot arm.
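The article goes into Shadow’s own approach; purely as an illustration (and not the article’s code), the sketch below shows the kind of thin, vendor-agnostic layer such an interface tends to boil down to: a small contract that every arm driver implements, plus a simulated driver for testing.

```python
# Illustrative sketch only (not the article's code): a thin, uniform interface
# that different arm drivers can implement, so higher-level software does not
# care which vendor's arm it is talking to.
from abc import ABC, abstractmethod
from typing import Dict, List


class ArmInterface(ABC):
    """Minimal contract a robot arm driver has to fulfil."""

    @property
    @abstractmethod
    def joint_names(self) -> List[str]:
        ...

    @abstractmethod
    def read_joint_positions(self) -> Dict[str, float]:
        """Current joint angles, in radians, keyed by joint name."""
        ...

    @abstractmethod
    def command_joint_positions(self, targets: Dict[str, float]) -> None:
        """Send a position command for each joint."""
        ...


class SimulatedArm(ArmInterface):
    """Stub driver that just stores the last command, handy for testing."""

    def __init__(self, joint_names: List[str]):
        self._joint_names = joint_names
        self._positions = {name: 0.0 for name in joint_names}

    @property
    def joint_names(self) -> List[str]:
        return self._joint_names

    def read_joint_positions(self) -> Dict[str, float]:
        return dict(self._positions)

    def command_joint_positions(self, targets: Dict[str, float]) -> None:
        for name, value in targets.items():
            if name not in self._positions:
                raise KeyError(f"unknown joint: {name}")
            self._positions[name] = value


if __name__ == "__main__":
    arm = SimulatedArm(["shoulder_pan", "shoulder_lift", "elbow"])
    arm.command_joint_positions({"shoulder_pan": 0.5, "elbow": -1.2})
    print(arm.read_joint_positions())
```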

Two easy steps to a powerful robot arm interface – Ugo Cupcic – Medium

How we beat the state of the art in a week 


At Shadow, we’re focusing on making complex robots intuitive to use. For that, we need very good path planning. There are plenty of amazing solutions out there, but we were recently faced with a project where those state-of-the-art solutions just weren’t good enough for us. We needed a super-fast planner that generated trajectories that “looked good”.
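The post doesn’t include code, but one common trick for making planned paths “look good” is shortcut smoothing: repeatedly try to replace a wiggly sub-path with a straight segment whenever that segment is collision-free. The sketch below is a generic toy version of that idea, not Shadow’s planner; is_segment_free is a stand-in for a real collision checker.

```python
# Toy sketch of shortcut smoothing, a common post-processing step that makes
# planned joint-space paths shorter and smoother-looking. Generic illustration
# only; `is_segment_free` stands in for a real collision checker.
import random
from typing import Callable, List, Sequence

Waypoint = Sequence[float]          # one joint configuration


def shortcut_smooth(
    path: List[Waypoint],
    is_segment_free: Callable[[Waypoint, Waypoint], bool],
    iterations: int = 100,
    seed: int = 0,
) -> List[Waypoint]:
    """Repeatedly try to replace a sub-path with a straight segment."""
    rng = random.Random(seed)
    path = list(path)
    for _ in range(iterations):
        if len(path) < 3:
            break
        i, j = sorted(rng.sample(range(len(path)), 2))
        if j - i < 2:
            continue                              # nothing to cut out
        if is_segment_free(path[i], path[j]):
            path = path[: i + 1] + path[j:]       # drop the detour in between
    return path


if __name__ == "__main__":
    # A wiggly 2-joint path; pretend everything is collision-free.
    wiggly = [(0.0, 0.0), (0.2, 0.5), (0.4, -0.3), (0.6, 0.4), (1.0, 0.0)]
    print(shortcut_smooth(wiggly, lambda a, b: True))   # -> [(0.0, 0.0), (1.0, 0.0)]
```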

How we beat the state of the art in a week – Ugo Cupcic – Medium

Two conferences, ten days, hundreds of robots


An important part of my job is to stay on top of what’s currently happening in robotics. Given the fast pace of the field, it sometimes feels more like trying not to drown! A great way to see all the latest trends is to attend a few key conferences. IROS and ICRA are the two biggest robotics conferences in the world, and I often go to those. I also attend a few smaller, more focused conferences to dig into specific subjects, such as control theory at the great DLMC conference in Zurich.

Two conferences, ten days, hundreds of robots – Ugo Cupcic – Medium

How to tell if my robot’s grasp is stable


An important part of our roadmap is making grasping trivial for the end user. We want to be able to point our robot at an object and simply tell it to grasp it. Although that sounds trivial from a human point of view, it is actually complicated for a robot. A crucial step in that direction is being able to quantify how well the robot is grasping the object; without that measurement, the robot will never be able to improve. In this post, we’ll focus on the different methods used to assess grasp quality.
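As a rough illustration of one empirical approach (an assumption on my part, not necessarily the exact measure discussed in the post), a simple proxy is to perturb the grasp and watch how much the object moves relative to the palm:

```python
# A simple empirical proxy for grasp stability (an illustration, not
# necessarily the post's exact measure): shake the hand while it holds the
# object and look at how much the palm-to-object distance varies. A stable
# grasp barely moves the object relative to the palm; an unstable one lets
# it slip around.
import math
from typing import Sequence


def grasp_stability(distances: Sequence[float]) -> float:
    """Return a score in (0, 1]; higher means the object moved less.

    `distances` are palm-to-object distances (metres) sampled while the
    grasp is being perturbed (e.g. the arm shaking the hand).
    """
    if len(distances) < 2:
        raise ValueError("need at least two samples")
    mean = sum(distances) / len(distances)
    variance = sum((d - mean) ** 2 for d in distances) / (len(distances) - 1)
    std_dev = math.sqrt(variance)
    # Map the spread to (0, 1]: zero spread -> 1.0, large spread -> towards 0.
    return 1.0 / (1.0 + 100.0 * std_dev)


if __name__ == "__main__":
    steady = [0.050, 0.051, 0.050, 0.049, 0.050]    # object barely moves
    slipping = [0.050, 0.060, 0.075, 0.090, 0.110]  # object drifting away
    print("steady  :", round(grasp_stability(steady), 3))
    print("slipping:", round(grasp_stability(slipping), 3))
```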

How to tell if my robot’s grasp is stable – Ugo Cupcic – Medium