Thursday, March 14, 2013

ConFoo

Before I started speaking, I always wondered how people find out about conferences. Turns out most information is still disseminated the traditional way: word of mouth. And that is how I heard about ConFoo: from other speakers. Specifically, I heard about it when I was in Amsterdam for the Dutch Mobile Conference. They told me ConFoo is a very well-run conference covering a variety of technologies, and that it treats speakers very well. So I followed ConFoo on Twitter, waited for the CFP, and submitted three talks. Two of them got accepted, which brought me to Montreal two weeks ago.

Ten tracks

I was a bit overwhelmed when I first looked at the program. There were ten tracks going on at the same time! Lots and lots of different technologies, and I often wanted to go to two talks in the same slot.

When there were conflicts, I often picked based on the speakers. I know both Sandi Metz and Katrina Owen are great speakers, so I made sure I went to their talks. They have great content, but they also pay great attention to delivery, bringing us on a journey throughout the talk.

I also tried to go to talks that would expand my horizons. Creative Machines by Joseph Wilk was a great example. I got to hear about AARON, a machine that was trained by a painter over 40 years. The results are quite amazing.


Sample work by AARON

Overview and deep dive

I gave two talks at ConFoo: Fluid Android Layouts and Mobile Caching Strategies. Mobile Caching Strategies is a high-level talk, no code at all, so I knew it would work well in a general conference like ConFoo.

Fluid Android Layouts is a deep dive, and I found it difficult to cater to the different levels of experience in the audience. To make sure everyone could follow along, I went over some Android basics at the beginning. But I also provided code samples so the Android developers in the crowd had something to take home.

Well-run conference

ConFoo covers so many topics, and yet manages to keep the environment very intimate. There were plenty of opportunities to mingle with everyone, including very nice sit-down lunches every day.


Nice lunch!

There were volunteers at each session to help the speakers set up, give timing signals, and collect feedback forms at the end. Overall everything was well communicated, and the organizers made sure everyone was having a good time. Bravo!


The organizers

Sugar Shack

With three conference days and ten tracks, there were a lot of speakers. The organizers arranged quite a few activities for us, one of which was a visit to a traditional sugar shack. It was an hour's drive from Montreal, in an idyllic setting among the snow.

Sugar shack
Sugar shack

We sat down in a wooden cabin and ate everything with maple syrup: bread, soup, sausages, frittata, meatballs, pancakes, coffee, just everything. I got to hang out with many different speakers, and not surprisingly, I already knew a few of them from various conferences, notably the Dutch Mobile Conference.

Nuit Blanche

I decided to stay an extra night to do some sightseeing. It happened to be Nuit Blanche, the winter celebration, and there were activities all over the city. We started by exploring the underground passageways, which were running an art exhibition. My favorite was by Baillat Cardell & Fils: a box with a mirror and a few tubes in front of it, cleverly arranged so that it looks like you are gazing into a tunnel.

Ici
Ici

We came up from underground and went to a square sprinkled with glowing snowballs. They were translucent ping-pong balls, each stuffed with an LED, and so much fun to play with! People juggled them, buried them in snow, and of course, threw them at each other. Such a brilliant idea!

LED snowballs
LED snowballs

At 9 o'clock there were fireworks. Afterwards we mingled with the crowd and checked out the various food vendors. The most interesting one had open fire pits, where people bought sausages and marshmallows to roast.

Open fire pit
Open fire pit

There was so much going on I couldn't possibly describe it all. We saw fire dancing, light shows, projections, blacksmithing, glass blowing... the whole city was awake until the wee hours. We couldn't believe it when we got back to the hotel - it was already 2am!

Monday, February 4, 2013

Learning iOS as an Android developer

As a veteran Android developer, I have always been curious about the iOS platform. Is Objective-C hard to learn? Is it really much easier to make beautiful UI in iOS?

I decided that the best way was to write an app on both platforms and compare. An app that I would actually launch, so I would experience the whole process, from coding to UI design to distribution. The result is Heart Collage, available on both the Apple App Store and Google Play.

Here are my thoughts after learning iOS for two months.

The setup

I wrote a universal app in iOS 6, with auto layout and storyboard. I chose iOS 6 for new functionalities like UICollectionView and UIActivityViewController. Auto layout and storyboard trickled down from the iOS 6 decision: since I was on the latest version, I might as well take advantage of the latest tools.

Learning curve

The first three weeks were painful. Not only did I not know anything, I lacked the vocabulary to ask questions. I would search for something and find 5 Stack Overflow threads, all of which sounded kind of related to what I needed, but not really. It was really frustrating.

But things changed after the third week. By then I knew which classes I was using, so I prefixed all my image manipulation searches with UIImage, and my navigation searches with UINavigationController. I also had some basic understanding of how things were organized, and was able to skim a thread and judge whether it was relevant.

Once I knew how to find answers on the internet, development speed really picked up. I felt like I was actually coding, instead of walking into a wall.

UI Editing

Initially I thought I would really be bothered by all the square brackets in Objective-C, but I got used to the syntax fairly quickly. What tripped me up was Interface Builder / Storyboard.

In both iOS and Android, there are two ways to specify layout: xml and code. The difference is that Android has readable xml. Not so much in iOS.

Both systems use unique ids to refer to the various components. In Android, you define the id like this:

<Button android:id="@+id/start_button" />

The build system gathers all the id tags and generates unique ids in Java:

public static final class id {
  public static final int start_button=0x7f08003b;
}

To refer to a view in your code, use findViewById:

Button startButton = (Button) findViewById(R.id.start_button);

In iOS, the storyboard directly generates the unique ids in the xml:

<button id="vMl-QF-OAb" />

To refer to a view in your code, you first define an IBOutlet in your .h file, then go to the storyboard and right-click drag your view onto the view controller. This assumes you have already told the storyboard that this particular window is linked to that view controller; otherwise the IBOutlet will not show up.

It all makes sense after the fact, but when I first started I would drag and drag and drag and not be able to link the views. Sometimes I forgot to specify the view controller. Other times I forgot to add the IBOutlet. On late nights I forgot it’s right-click drag, not just drag.

The most difficult part was that I could not compare my implementation with sample code, because it is all visual. In Android I would diff the whole project, code and xml and all, to find out what I missed. The xml produced by Interface Builder / Storyboard is not diff friendly at all.

Built-in Components

Once I learned the ropes of UI editing, I could build the various screens for the app. People claim that the built-in components in iOS are much more beautiful than Android's, but the gap has narrowed significantly since Ice Cream Sandwich. Sure, the iOS UIPickerView is still much more delightful to use than the Android Spinner, but basic components like buttons are pretty much on par.

There was one thing that was much much easier to use on iOS than Android: the camera preview. Heart Collage shows a square camera preview for you to pose. In iOS, I can ask for a preview window in any aspect ratio, and the system crops the camera feed automatically. In Android? The system stretches the camera feed to the aspect ratio of the preview. To make a square camera preview I had to make the preview window the same aspect ratio as the camera feed, and cover up some parts so it appears to be a square. It was really involved. Who wants a distorted camera feed anyway? Cropping is the right thing to do.
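That Android workaround boils down to some aspect-ratio arithmetic. Here is a minimal sketch of the idea (a hypothetical helper, not the actual Heart Collage code): size the preview surface so the feed keeps its aspect ratio while covering the square, and mask the overflow with other views.

```java
// Hypothetical helper illustrating the Android workaround described above:
// scale the camera feed so it covers a square preview window, letting the
// longer edge overflow (to be covered up by surrounding views).
public class SquarePreview {
    // Returns {width, height} of the preview surface for a square of
    // side squareSide, preserving the camera feed's aspect ratio.
    public static int[] previewSize(int squareSide, int feedWidth, int feedHeight) {
        if (feedWidth >= feedHeight) {
            // Landscape feed: match the height, let the width overflow.
            return new int[] { squareSide * feedWidth / feedHeight, squareSide };
        } else {
            // Portrait feed: match the width, let the height overflow.
            return new int[] { squareSide, squareSide * feedHeight / feedWidth };
        }
    }
}
```

For example, a 640x480 feed shown in a 480-pixel square needs a 640x480 preview, with 80 pixels covered on each side.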

For the rest I almost always found direct correspondences: ImageView maps to UIImageView, TextView maps to UILabel, ListView is roughly UITableView, and GridView... well, GridView is interesting. Up until iOS 5 there was no built-in grid view; you had to use a UITableView and lay out the cells on each row yourself. I was shocked when I heard that. Guess I'm spoiled by Android? We have had that since version 1! Fortunately UICollectionView was introduced in iOS 6, and unlike Android, it is okay to target the latest OS release because most users upgrade very quickly.

This brings us to the famous fragmentation debate.

Fragmentation

There are two kinds of fragmentation: OS version and device form factor.

OS version

iOS is definitely better positioned against OS version fragmentation, since Apple is the sole manufacturer of all iOS devices and they completely control the OTA schedule.

Device form factors

Until recently the device form factor was pretty uniform: Retina and non-Retina, that's it. Different density, same aspect ratio. Same aspect ratio means you can still use a coordinate-based layout system and align your views in Interface Builder.

Everything was peachy until the iPhone 5. Suddenly there was a different aspect ratio, and Apple needed something more powerful than struts and springs. The solution is Auto Layout.

Auto layout is a declarative way to specify the positions of your views. Instead of saying, put this image 240 pixels from the top, you say, center vertically. The system computes the xy-coordinates based on your constraints, so it adapts well to different form factors.

Auto layout sounds good on paper, but it is really clunky in practice. In Interface Builder, you still drag and drop your views, and Xcode tries to guess your intention. Most of the time it gets it wrong, so I have to remove the automatically generated constraints and create my own. I also tried doing it in code, but it is very verbose, and very easy to make mistakes. The visual format helps a bit, but most of the time I want to center my views, and there is no way to specify that in ASCII.

This is when I really, really miss Android. The system was designed from day one to handle multiple form factors, and you are introduced to concepts like match_parent and wrap_content from the very beginning. I declare my layout in xml, spell out relationships among the views with human-readable ids, and can easily verify my rules whenever I need to add a view. In iOS I am always doubtful when I drop in a new view. What did it do to the existing views? It is so tedious to click through them one by one and examine all the constraints.
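To show the kind of declarative layout I mean, here is a minimal Android sketch (made-up ids, not from Heart Collage): center an image, and hang a caption below it, all readable at a glance.

```xml
<!-- A minimal sketch: image centered in the parent, caption pinned below. -->
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent"
    android:layout_height="match_parent">

    <ImageView
        android:id="@+id/collage"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_centerInParent="true" />

    <TextView
        android:id="@+id/caption"
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_below="@id/collage"
        android:layout_centerHorizontal="true" />
</RelativeLayout>
```

Adding a new view means adding one more block with its own relationship rules, without touching the existing ones.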

Perhaps there is a better way. But all my iOS developer friends started before iOS 6, before auto layout was available. They declare their views in code, compute the frames by hand, and basically run their own layout algorithms. And there is no reason to convert once you have a system in place, so I am on my own on the auto layout front.

Intents

Another thing I miss about Android is the intent system, both for navigation and integration.

Navigation

For Heart Collage, I capture your poses with the camera, then replace the camera activity with the view collage activity, which shows the mosaic. Here is what I do in Android:

Intent intent = new Intent(this, ViewCollageActivity.class);
startActivity(intent);
finish();

In other words, I add the view collage activity onto the activity stack, and remove the camera activity by calling finish().

It took me a very long time to figure out how to do that in iOS. In storyboard, most of the time you push a new view controller onto the stack by adding a segue to a button. You can also trigger a manual segue, which is what I do after the camera snaps all the photos. The tricky part is, how do I pop the old view controller? If I push first, the old view controller is no longer on top of the stack, so I cannot pop it. If I pop first, the old view controller is no longer on the stack, and I am not allowed to ask for a segue from it.

This is the moment when I doubted whether it was wise to go with storyboard. It seems to be designed for very simple navigation needs, and even my 4-screen app is too complicated for it. I ended up popping one level higher, with a flag to automatically forward me to view the collage. A bit of a hack, but I was too deep into storyboard to back out and recreate all the views in xibs. Especially since there is no way to copy and paste the layouts, so I would have had to drag and drop everything all over again.

Integration

After you make a Heart Collage, the app lets you share it with your friends. This is super easy on Android. I just create an Intent saying that I want to share an image, and the system automatically generates the list of installed apps that can handle it. It's an elegant way to provide a personalized and extensible experience. Users can share their collages with any app they prefer, and I don't even need to know they use that app, let alone create a new integration point.
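The sharing Intent described above looks roughly like this (an illustrative sketch of Android framework code; collageUri stands in for a hypothetical content:// Uri pointing at the saved collage):

```java
// Declare what we want to share; the system finds the apps that can handle it.
Intent shareIntent = new Intent(Intent.ACTION_SEND);
shareIntent.setType("image/*");
shareIntent.putExtra(Intent.EXTRA_STREAM, collageUri);
// Show the chooser listing every installed app that handles ACTION_SEND images.
startActivity(Intent.createChooser(shareIntent, "Share your collage"));
```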

For sharing, iOS 6 provides similar functionality with UIActivityViewController. I set up the message and image, and it brings up a list of sharing options. The big difference is that the list is curated by Apple and not extensible by the user. So everybody will see Sina Weibo as an option, whether they care about it or not.

This is where Android really shines: seamless integration among apps, and as a result a very personalized experience.

Distribution

Beta testing

Finally my app was ready for beta testing. Yay!

Here are the steps for both platforms:

Android

  1. Compile the apk
  2. Email it to some friends
  3. There is no third step

iOS

  1. Collect the device UDID from each friend
  2. Create a provisioning profile on the iOS dev portal
  3. Add the UDID of each new test device
  4. Download the provisioning profile from the iOS dev portal
  5. Compile the ipa
  6. Email the provisioning profile and ipa

The most painful part is that I have to manually add each test device on the provisioning portal, then download the updated profile to my local disk to compile the ipa. So tedious.

On the flip side, though, I know exactly who can run my app, and I don't need to worry about leaks. With Android, once you send out an apk you have no idea where it will go, and there isn't really a good way to limit the distribution.

Release

And now, the final moment: release to the store. No anxiety for Android at all. Just upload, wait an hour or so, and it's live. For iOS, there is the review process.

I wanted to release Heart Collage before Valentine's Day, so I submitted at the end of January. There should have been plenty of time, but the potential for rejection was stressing me out. I was so relieved when the app got approved on the first try, in 6 days. Jubilation!

Verdict

I have mostly been pointing out the differences between iOS and Android. But at the end of the day, they are more similar than different. In terms of technology, at least. The jury is still out on the money. Is it true that iOS users are more willing to pay for apps? Which platform will generate more revenue? That will be the driving force behind my decision to spend time on iOS vs Android, and the numbers aren't in yet. Will Heart Collage get more downloads on iOS or Android? We shall see.

Many thanks to Cyril Mottier and Tim Burks for reviewing the draft of this article.

Wanna check out Heart Collage? Download it here:

Friday, February 1, 2013

Heart Collage launched

Super excited that I just launched my app Heart Collage, just in time for Valentine's Day!

The app shows you how to pose for each part, snaps the shots one by one, and stitches them all together into a Heart Collage. I love watching people pose for the various parts of the heart - it's hilarious!

Get the app from Google Play or Apple App Store, and let me know what you think!

Tuesday, January 1, 2013

2012: Year of Speaking

Happy New Year! 2012 has been wonderful to me. I started out with the goal of becoming a public speaker, specifically 5 lightning talks and 3 full-length lectures. While I fell short on the lightning talks (4 instead of 5), I went way beyond on the lectures (13 instead of 3), speaking in faraway places like London, Bucharest and Antwerp. You can see all my talks on my speaker profile. (Yup, I have a speaker profile!)

Besides giving talks, I also wrote a few blog posts on speaking:

All in all, I consider it mission accomplished!

New Resolution?

I really enjoy public speaking, and will continue to do so in 2013 and beyond. I wanted to set a new goal for the new year, but I couldn't come up with a crisp one like last year. Here are a few of my candidates:

  1. Mentor others to speak at tech conferences
  2. Give a keynote speech
  3. Write a book

Mentoring

Mentoring others is a great goal, but I had difficulty coming up with a measurable result. Initially I thought, "put 3 people on stage", but I suspect I won't be the sole reason someone speaks, so it is really hard to quantify.

Keynote

A keynote speech is an interesting goal. Keynotes open conferences, address a large crowd with diverse backgrounds, and have to send everybody home with a lesson or two. It is quite a daunting task. My aspiration would be to give a keynote that is deeply rooted in my technical knowledge but unveils general life lessons. I have no idea how I would write a speech like that, and even if I did, keynotes don't have an open call for speakers, so I have no idea how I would be invited to deliver one. The lack of a game plan makes me uneasy, but perhaps that's the kind of goal I should shoot for?

Book

For the book, I'd like to write an advanced Android book that expands on my various talks. The main thing holding me back is time. I have spoken to many tech book authors, and all of them told me it is a lot of work. More work than you think, and then some. With so many projects going on, I am not sure if I should take on such a big goal.

No Resolution

In the end I decided I won't have a new year resolution. I will still mentor others to speak, think about my pie-in-the-sky keynote, and test the writing waters with a chapter or two. But I won't have a resolution with a big, measurable result like last time ^_^

Sunday, December 9, 2012

AnDevCon IV

It's AnDevCon time again! My Android Custom Components talk was really popular at AnDevCon III, so I decided to give it again. On top of that, I added a two-part session on Android UI, both a lecture and a hands-on workshop.

Murphy's Law

I prepared my talks a few weeks ahead of time, so I was pretty chill about the conference. But as Murphy's Law dictates, anything that can go wrong will go wrong. On the Monday of the conference week, my laptop died. My talks were on Wednesday and Thursday. Needless to say, I was completely thrown off course. I was so stressed that I had nightmares on Monday night.

On Tuesday I took the laptop to the Apple Store, and they had to ship it out for repairs. My slides were all online, but I still needed a machine to project them. Fortunately a friend lent me a laptop, and I installed a Photoshop trial for the demo in my session.

Since I didn't want to install too many new programs on my friend's computer, I took a break from programming on Tuesday night. Instead, I baked cookies to bring to the conference.

As at AnDevCon III, I gave away bugdroid-shaped cookies to encourage participation, and people loved them.

Beautiful Android on a Shoestring

My first session was Beautiful Android on a Shoestring. I shared my experience creating beautiful Android apps without knowing how to draw, introducing concepts like xml drawables, text shadows, shaders and custom fonts. I am especially proud of using icon fonts for scalable icons, a concept I borrowed from the web.

After showing the Android techniques, I switched gears and showcased my favorite websites for color schemes, fonts, icons and more:

I recorded the talk with my phone:

Unfortunately the video did not capture the slides. But fear not: I used Popcorn.js to embed the video in the slide deck. Play the embedded video in the top right corner, and the slides will advance automatically to match. Check it out: http://bit.ly/BeautifulAndroid

Hands-on Icon Creation

I led a hands-on Photoshop workshop right after Beautiful Android on a Shoestring. I know very little about Photoshop, but what I learnt from the Graphic Design for Engineers workshop was super useful, and I wanted to share it with other developers. I had a relatively small class, which was great because I could check that everyone was following along.

Android Custom Components

The next day I gave my Android Custom Components talk. This is my fourth time giving this talk, so I was very comfortable with the material. Still, every talk is a live performance, and the audience is always different. I really enjoyed the interaction.

Since I already recorded the talk at AnDevCon III, I did not set up my video taping this time. Here is the recording from May:

Attending sessions

My three talks clashed with a lot of the other sessions I wanted to attend, which was a bit disappointing. Fortunately I managed to catch the Android concurrency talk, which was jam-packed with information. I live-tweeted it since it was too good to keep to myself! Here are a few of the tweets:

Great conference

Once again I had a great time at AnDevCon. It's really cool to hang out with so many Android developers. Definitely check it out if you work with Android!

Monday, December 3, 2012

Why Do I Speak At Conferences?

I have been asked by many people why I speak at conferences. I know I enjoy it tremendously, but it took me a while to pinpoint why. Here are my top three reasons.

Share Knowledge

First and foremost, I want to share what I know. As a developer I face new challenges every day, often scouring the internet for hours to figure out how to implement a new feature or get rid of a mysterious bug. I don't want my efforts to go to waste. I keep a blog to share my findings, but sometimes I feel like I am talking into the void.

At a conference I have a live audience. I get instant feedback, perhaps a puzzled look that nudges me to explain in a different way, or a knowing smile that tells me I struck a chord. It is truly rewarding to see that sparkle of understanding, to know that you have made a difference.

Network effortlessly

After you give a talk, people come to you during lunch and coffee breaks. They heard you speak, they thank you for the great talk, and they want to discuss more. These conversations are way more interesting than typical small talk, and I have met many wonderful people this way. As a speaker, networking becomes effortless because people come to me, and focused too, because they come knowing my interests. No more wandering aimlessly, shaking hands and collecting business cards without knowing why.

Be visible

As much as you would like to believe in a meritocratic society, unseen achievements are, by definition, not recognized. By stepping on stage and sharing your knowledge, you are seen as an expert. I know this in the back of my mind, but I am still amazed by the wonderful opportunities that have presented themselves since I started speaking.

One thing I did not expect about my visibility was that it is not just about me. I did not set out to defy the coder stereotype, but the truth is, I am not white, and I am not male. Every time I step on stage, I assert my identity as a software engineer, as a woman, as a speaker with a Cantonese-British-American accent, as someone who laughs at the smallest little thing, as myself. By being visible, I make it a bit easier for the next person working against the subconscious assumptions of what it means to be a software engineer; I push the envelope a little further, towards a more diverse workforce in our industry.

Wanna speak?

I cannot believe that I only started speaking this year. And I cannot figure out why I never thought of doing it before. Perhaps you would like to give it a shot as well? Who knows, you may love it as much as I do.

There are many resources out there, We Are All Awesome and speakup.io being two. I have given a talk on how to come up with talk topics, and also written a blog post on how to be a confident speaker. I'd love to share more, so let me know if there is something specific you want to hear about!

Technically Speaking

Edit: I started the Technically Speaking newsletter with Cate Huston two years after writing this blog post. Great resources delivered to your inbox every week!

View archive on TinyLetter

Thursday, November 22, 2012

Android: Jelly Bean forward locks paid apps

Today, a Monkey Write user complained that sound wasn't working in the workbooks he bought. I sell workbooks as separate paid apps on Google Play, which supply data to the main app. Most of the data is retrieved via a ContentProvider, but for the sound the main app simply reaches into the other apk and loads the sound file from its assets:

Context workbookContext = context.createPackageContext(
    packageName, 0);
AssetFileDescriptor afd = workbookContext.getAssets().openFd(
    "pronunciations/" + soundFile + ".ogg");

This depends on the resources and assets of an app being world-readable. But from the logcat, it seems that the main app could not find the sound file in the asset folder of the workbook app:

java.io.FileNotFoundException: pronunciations/kou3.ogg
 at android.content.res.AssetManager.openAssetFd(Native Method)
 at android.content.res.AssetManager.openFd(AssetManager.java:331)

I searched and searched the internet to no avail, until I verified with the user that sound worked fine for the free workbooks he downloaded. By including the word "paid" in my queries, I found a StackOverflow post with this critical bit of information:

Paid apps on JB devices are 'forward locked'. This means that the APK is split in two parts -- one with public resources, and one with private ones and code, which is not readable by other apps. I haven't looked into how files are split in detail, but the problem you are seeing suggests that assets are part of the private APK.

More importantly, he provided a way to test forward locking without going through Google Play:

adb install -l myapp.apk

With that, I managed to reproduce and fix the problem. Since the assets of the paid app are no longer world-readable, I now use a ContentProvider to read the sound file and have the main app query for it. The user downloaded the update from Google Play and verified that sound now works for paid workbooks. Yay!
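The fix can be sketched as a ContentProvider in the workbook app that serves its (no longer world-readable) sound assets to the main app. This is an illustrative sketch of Android framework code; the class name, the uri scheme, and the trivial stub methods are my assumptions, not the actual Monkey Write source.

```java
// Hypothetical provider in the workbook app. The main app queries it via
// a content:// uri instead of reaching into the workbook apk's assets.
public class SoundProvider extends ContentProvider {
    @Override
    public AssetFileDescriptor openAssetFile(Uri uri, String mode)
            throws FileNotFoundException {
        // e.g. content://com.example.workbook/kou3 -> pronunciations/kou3.ogg
        String soundFile = uri.getLastPathSegment();
        try {
            // Runs inside the workbook app, so its own assets are readable.
            return getContext().getAssets().openFd(
                    "pronunciations/" + soundFile + ".ogg");
        } catch (IOException e) {
            throw new FileNotFoundException(uri.toString());
        }
    }

    // Minimal stubs for the rest of the ContentProvider contract.
    @Override public boolean onCreate() { return true; }
    @Override public String getType(Uri uri) { return "audio/ogg"; }
    @Override public Cursor query(Uri uri, String[] projection, String selection,
            String[] selectionArgs, String sortOrder) { return null; }
    @Override public Uri insert(Uri uri, ContentValues values) { return null; }
    @Override public int delete(Uri uri, String selection,
            String[] selectionArgs) { return 0; }
    @Override public int update(Uri uri, ContentValues values, String selection,
            String[] selectionArgs) { return 0; }
}
```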