Mattias Rost

Researcher and Coder

Why and How to Quickly Build Apps - Make No Decisions

Posted on 2015-03-10

Today I was invited to give a talk at the Mobile Apps Group Meetup in Glasgow. I decided to talk briefly about my own app development in my research, why it involves quickly building apps, and how I tend to do that.

I first gave the premise of my work: To understand an app (or the ideas manifested in the app), it needs to be built, so that it can be studied in use. In my view, we can only know what an artefact is once it is in use. We cannot know what it is prior to that.

I then explained how I suggest people go about this. None of the points are anything new, but hopefully people will start doing them once they have heard them often enough, so I figured they were worth talking about. In short, I am trying to convey that you should make decisions when they are easy to make, and refrain from making them when doing so is time consuming.

Thus the process is:

  • Sketch a lot of ideas. Sketch on paper or in any other material that is easy to produce and easy to discard.
  • Make a mockup using Sketch, Photoshop, or anything else that allows you to create exactly what you want the different screens of your app to look like. This is where decisions are made. From the sketches made previously, pick one and turn it into images that say exactly what the app will look like on screen. The purpose of this mockup is to describe what is to be implemented in the next phase.
  • Now you build. But make no decisions whatsoever. Just transfer the decisions manifested in the mockups, down to the font sizes, margins, and colours. As soon as you start playing around with margins, font sizes, and colours, you start losing a lot of time. The reason is that it is not as easy or efficient as it would have been had you done this when creating the mockups, so you should not do it now. If it makes it easier, pretend that the mockups come from a paying customer who is paying you to implement an app that looks exactly as they have decided, leaving you no room for creative suggestions.

In my experience, following this simple rule of making all decisions while creating the mockups, and making no decisions when implementing, makes implementation a breeze. I think one of the reasons for this is that when you try to make decisions in the implementation phase you not only need to think about how you would like it to look, but also about how to make it so. Having the decision already made means you only need to think about how to make it.

Valentine's IoT mirror

Posted on 2015-02-14

I have for the longest time wanted to get into electronics. While I have been tinkering with connecting lights and motors to batteries since a very young age, and have taken several electronics courses both in high school and at university, I have never done anything practical. Recently, however, I started putting my now-of-age reserves of money towards DIY electronics in order to try and finally do something. In the last couple of weeks I have learned how easy it really is once you actually have the components you need, and how the DIY culture, especially around Arduino, has made these things incredibly accessible. It's safe to say that we've come a long way since the days of holding a small toy lightbulb against a 4.5V battery (which I did as a kid).

When I get into something I tend to immerse myself quite deeply, so for this Valentine's Day I quite readily jumped at the idea of building something with my newfound skills and toys. I decided to connect a few LEDs in the shape of a heart that could flash in interesting ways, as a gift for my girlfriend. As you can see in the video, it eventually turned into a mirror with hidden heart-shaped LEDs that can be controlled through a web interface on the local network.

I had at this point built up a sensor network at home using the nRF24L01+ for wireless communication (an incredibly cheap yet easy-to-use RF module), with both Arduino Pro Minis and ATtiny85s as brains. I therefore already had one of the radio modules connected to an Arduino communicating serially with a small server in the house. All I had to do was figure out how to control and drive a bunch of LEDs (I ended up using 14) from the Arduino, and then try to package it nicely enough to be a suitable gift.

The mirror is built around an Arduino Pro Mini 5V and an nRF24L01+, powered by a 12V power adapter. I am pretty sure 12V is overkill, but it's what I had lying around. The nRF24L01+ must be run on 3.3V, so it also uses a voltage regulator that I put on the control board for the LEDs. The LED control consists of two 547 transistors, and the LEDs are current limited with 470 ohm resistors. I figured that I couldn't run the current for all 14 LEDs through one transistor (they would draw about 150mA, and I didn't have any other transistors lying around) and therefore split them over two. I also ended up connecting the bases of the transistors to two different pins on the Arduino, so in theory I could control the upper and lower parts of the heart individually. I just ended up not doing that.

The mirror continuously listens to radio traffic. When a control message arrives, it changes the mode of the LEDs. The software on the mirror is therefore incredibly simple. The server, however, had to be modified slightly. Up to this point the server had only been listening to sensor nodes, not transmitting any information back. I therefore had to change both the Arduino program, so that it listens for serial communication and forwards the data to the radio, and the software running on the server. The existing server software was written in Python, but I decided to switch to NodeJS. Since I also wanted to be able to control the mirror from a web interface, I took the time to build a simple REST API for sending commands to the mirror over HTTP. The server software now listens to the serial port for readings from the sensors in the house and stores them in a database, as well as receiving HTTP requests for sending out data packets to listening nodes.
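To give an idea of how simple the mirror side is, here is a minimal sketch of roughly what that firmware looks like, written against the common RF24 Arduino library. The pin numbers, pipe address, and one-byte command format are assumptions for illustration, not the exact code running in the mirror.

[code]
#include <SPI.h>
#include <RF24.h>

RF24 radio(9, 10);                      // CE and CSN pins (assumed wiring)
const byte PIPE_ADDRESS[6] = "mirr1";   // listening pipe address (made up)

const int LED_UPPER = 5;                // transistor base for the upper half of the heart
const int LED_LOWER = 6;                // transistor base for the lower half

enum Mode { MODE_OFF, MODE_ON, MODE_HEARTBEAT, MODE_PULSE };
Mode mode = MODE_HEARTBEAT;

void setHeart(int level) {
  analogWrite(LED_UPPER, level);        // both halves are driven the same here,
  analogWrite(LED_LOWER, level);        // but they could be controlled individually
}

void heartbeat() {
  setHeart(255); delay(100); setHeart(0); delay(100);   // two quick flashes...
  setHeart(255); delay(100); setHeart(0); delay(600);   // ...followed by a pause
}

void pulse() {
  for (int i = 0; i <= 255; i += 5) { setHeart(i); delay(15); }   // fade up
  for (int i = 255; i >= 0; i -= 5) { setHeart(i); delay(15); }   // fade down
}

void setup() {
  pinMode(LED_UPPER, OUTPUT);
  pinMode(LED_LOWER, OUTPUT);
  radio.begin();
  radio.openReadingPipe(1, PIPE_ADDRESS);
  radio.startListening();
}

void loop() {
  if (radio.available()) {              // a control message arrived from the server
    byte cmd = 0;
    radio.read(&cmd, sizeof(cmd));
    if (cmd <= MODE_PULSE) mode = (Mode)cmd;
  }
  switch (mode) {                       // render the current mode
    case MODE_OFF:       setHeart(0);   break;
    case MODE_ON:        setHeart(255); break;
    case MODE_HEARTBEAT: heartbeat();   break;
    case MODE_PULSE:     pulse();       break;
  }
}
[/code]

The blocking delays in the patterns mean a new command is only picked up between cycles, which is more than good enough for a decorative mirror.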

Finally I created a super simple web page with four buttons for selecting one of the four settings on the mirror: heart-beat, off, on, and smooth pulse.

It was a fun project, my girlfriend was happy, and I can now move on to the next one. The project gave me experience in designing and soldering a PCB (using veroboard) and cutting such a board using a drill, and showed me how simple it can be to put things together in not-too-shabby ways.

Lived Informatics at CHI, Toronto

Posted on 2014-04-30

Today I presented our paper 'Personal Tracking as Lived Informatics' at CHI. The paper is based on interviews with 22 people about how they were using personal tracking devices, such as the Nike Fuelband, Jawbone Up, and FitBit, as well as mobile apps (e.g. RunKeeper and MyFitnessPal).

Among the findings, we show how the people we talked to use multiple trackers and track multiple things. They switch between trackers, as well as between what they track, over time. While they say that they do not share tracked data on social networks, they do track together with people in their lives. Furthermore, while they track for a long time, they rarely look at their historical data.

We discuss our findings and present an alternative view of Personal Informatics, which we term Lived Informatics. Lived Informatics emphasises the emotionality of tracking, and that tracking is something done with an outlook towards the future, rather than as part of a rational, scientific process of optimising the self, as seen in the Quantified Self and in Personal Informatics.

We are very happy the paper got an Honorable Mention.

The paper:

Rooksby, J., Rost, M., Morrison, A. and Chalmers, M. (2014) Personal Tracking as Lived Informatics. To appear in Proceedings of CHI’14, Toronto, Canada.

Guest lecture in Gothenburg

Posted on 2014-04-07

On Friday I gave a guest lecture in a course at LinCS at the University of Gothenburg. The course, 'Understanding and designing for social media practices', is aimed at exploring methods for studying online social media.

The lecture was based on two previous studies of foursquare. The first was a primarily interview-based study in which we looked at performative aspects of using foursquare. The second was a study of log data from foursquare. The talk discussed what it means for people to share their location by checking in (or what it means to check in), what this means for the genre of the data, and showed examples of communicative features in the foursquare log data.

The slides can be found on slideshare.

Texture problems in Android OpenGL

Posted on 2013-11-22

I encountered such a stupid thing, one that I do not wish anyone else ever to encounter, that I feel I have to write a post about it. Tl;dr: BitmapFactory.decodeResource may rescale the image to match the pixel density of the device, which is not good if you care about the pixel dimensions of the image.

I recently had a problem with textures in OpenGL ES 2.0 on Android. Everything worked like a charm on the development device I was using, and everyone was happy. But when I found a less common device in the office, I decided to try it on that device to see how it performed. I did not expect it not to work, but was prepared for there to be some performance issues. Little did I expect that nothing would show on the screen. No proper error messages were shown either. Just a black screen.

When loading textures in OpenGL on Android, people generally use something like

[code]
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(),R.drawable.cool_texture);
GLUtils.texImage2D(GLES20.GL_TEXTURE_2D, 0, bitmap, 0);
[/code]

It seemed doubtful that anything could go wrong with these two lines, so I went to check the most obvious things I could think of. First off, I checked whether the device required power-of-two-sized textures (256x256, 512x512, etc.). I created a texture that was 256x256 and loaded it. It did not work. So I went ahead and tried all kinds of things, and stripped the code down to the bare minimum to isolate where the problem was.

To cut things short: the problem was with those two lines, and the way BitmapFactory.decodeResource decodes resources... The bitmap I obtained did not have the dimensions of the image I created, but something slightly smaller. The fix:

[code]
// disable density-based scaling so the bitmap keeps its original pixel dimensions
final BitmapFactory.Options options = new BitmapFactory.Options();
options.inScaled = false;
Bitmap bitmap = BitmapFactory.decodeResource(context.getResources(), R.drawable.cool_texture, options);
[/code]

Once this was added, I learned that the problem indeed was with power-of-two-sized textures, and I could take care of it.

Ps. if you want to check whether the platform supports non-power-of-two-sized textures, you can run this check:

[code]
// query the GL extension string (e.g. in onSurfaceCreated) and look for NPOT texture support
String extensions = gl.glGetString(GL10.GL_EXTENSIONS);
mSupportsNPOT = extensions.indexOf("GL_OES_texture_npot") != -1;
[/code]

Potentially not the nicest way of doing it, but it works. Ds.