Friday, 17 June 2016

A Design Workflow Tutorial for Developers: Deliver Better UI/UX On Time

A great designer or design team is an invaluable asset to any project. With clear communication channels and free-flowing cooperation, the designer should give you everything you need to speed up the build process and limit questions and confusion as much as possible.
What can you, the UX developer, do to ensure that the product you build is delivered on time without sacrificing the quality of the user interface and user experience?

My answer: Get your designers involved from day one, and keep them involved throughout the entire UI/UX development process. Make sure to establish clear communication lines and consistent messaging between developers and designers.

Do You Have Everything You Need?

The worst thing that can happen during the implementation of any UI is a lack of communication between the designer and the developer (unless they’re the same person). Some designers think their job is done once the PSD is sent over. But that’s just wrong! You must create an always-on communication workflow that lasts beyond the delivery of the PSDs.

Projects where the designer simply submits the design files, and the developer simply implements them, are the projects that fail.

In many cases, it will take time before the designers see the actual UI/UX design implementation. To their surprise, the build is often completely different from the initial submission. (This has happened to me more than once. I sent over source files with complete descriptions and interaction prototypes, but when I finally saw the project, months later, it had a different layout, different colors, and no interactions in place.)
Some designers might hate me for this, as this design workflow requires a lot of “extra” work on their side. However, creating and delivering full assets and information, in an organized way, is better for the project and the team as a whole.
If a developer has everything that they need in front of them, it will speed up the process. A clean PSD is just not enough.
What do you need to get the job done effectively and efficiently?
These are the assets that a developer should expect from the designer to bring a UI/UX design to implementation:
·    Resource file - The designer should place every element of the app in one file. This file should contain buttons, checkboxes, header styles, fonts, colors, etc. Based on the information in this file, the developer should be able to recreate any interface from scratch. It’s much easier for a developer to export any element from a single PSD than to search multiple files for it.

·    Assets - Make sure that developers get all the required assets exported and ready to use, so the source files never need to be touched again.

·    Interaction prototypes - The days of static screens are long gone. Using smart interactions and animations to smooth out the UX design workflow and implementation is common practice now. But you can’t just tell a developer “this will slide in from the left.” The designer should create an actual prototype of that interaction. The prototype should include information like speed, velocity, etc., and the designer is expected to specify each of these values.

·     Naming convention - Request a file naming structure to keep things organized. It’ll make it easier for both of you to navigate files. (No one likes to have things hidden in a background folder.)

·     HDPI Resources - Screens now come in a huge range of pixel densities. Make sure that the designer delivers images in all of the required resolutions, so your application looks crisp everywhere. Note: use vectors (SVG) as much as possible; it’s going to help you a lot.
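The naming convention and HDPI bullets above can even be enforced automatically. Here is a minimal sketch (the `name@2x.png` naming scheme and the 1x/2x/3x density list are illustrative assumptions, not a standard this workflow prescribes) that scans a list of exported filenames and reports assets missing a density variant:

```python
import re
from collections import defaultdict

# Assumed naming convention: <name>@<scale>x.png, e.g. "icon_save@2x.png".
ASSET_PATTERN = re.compile(r"^(?P<name>[a-z0-9_]+)@(?P<scale>[123])x\.png$")
REQUIRED_SCALES = {"1", "2", "3"}  # 1x, 2x, and 3x exports expected per asset

def find_missing_variants(filenames):
    """Return {asset_name: missing_scales} for assets lacking a density export."""
    scales = defaultdict(set)
    for filename in filenames:
        match = ASSET_PATTERN.match(filename)
        if match:  # files that break the naming convention are simply skipped
            scales[match.group("name")].add(match.group("scale"))
    return {name: sorted(REQUIRED_SCALES - found)
            for name, found in scales.items()
            if found != REQUIRED_SCALES}

exported = ["icon_save@1x.png", "icon_save@2x.png", "icon_save@3x.png",
            "btn_login@1x.png", "btn_login@3x.png"]
print(find_missing_variants(exported))  # {'btn_login': ['2']}
```

Running a check like this before implementation starts catches missing exports early, instead of mid-build when the designer may be unavailable.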

If you do find something else missing during the implementation, don’t be afraid; ping the designer and ask for it. Never skip, and never skimp! You are members of the same team, and your job is to deliver the best possible product. If a designer fails, you fail as well.

Work In-Progress

Utilize your designers during the UI/UX development process. Don’t keep them on the sidelines expecting them to just “push the pixels.” A designer sees possible improvements even before the implementation starts. To take advantage of this, keep them in the loop. Provide them with access to see, and test, the work in progress. I’m well aware that no one likes to share unfinished projects. But it is much easier to make changes in the middle of a build than at the end. Doing so may save you time and prevent unnecessary work. Once you give the designer a chance to test the project, ask them to compile a list of problems, propose solutions, and suggest improvements.
What should you do when a developer has an idea that would change the look of the application? Discuss it with the designer, and never allow a developer to modify the design without consulting the designer. This design workflow will ensure that the build stays on track. A great designer has a reason for every element on the screen. Taking a single piece out, without understanding why it’s there, could ruin the user experience of the product.

UI/UX Design Project Management

Designers think that developers can bring a design to life in one day, or even in one hour. But, like great design, great development takes time and effort. Keep your anxious designer at bay by letting them see the progress of the build. Using external project management software to make sure every revision is accounted for is a great way to ensure you don’t miss important information discussed in an email conversation or a Skype session. And let’s be honest: sometimes changes and activities aren’t even communicated until they happen.
Whatever solution you use, be sure to choose one workflow process that the whole team will adopt and consistently use. On our team, I tried to push Basecamp because that’s what I was using, but our front-end developers found its features limited. They were already using other project management software to track bugs, progress, etc., such as JIRA, GitHub, and even Evernote. I understood that project tracking and management should be kept as simple as possible, so I migrated my UI design workflow to JIRA. I wanted to make sure they understood my workflow and progress, but I did not want them to feel like design was another thing to manage.
Here are a few suggestions for a project management tool:
·     Basecamp - Tracks the progress of the design and development related tasks, and easily lets you export tasks. It also has a simple mobile client.

·     JIRA - A fully customizable platform where you can easily set up custom boards for different areas. For example, organize boards to track activities such as back-end, front-end, design, etc. I think the mobile client is a bit weak, but it is a great solution for bigger teams and includes a bug tracking feature.

·     Email - This is great for setting up a conversation or sending images. But please be careful if you use email for feedback. Things can easily get lost.
You can also try Trello and other project management software, but the most widely used in our industry are Basecamp and JIRA. Again, the most important thing is to find a project management system that everyone can use on a consistent basis, as otherwise it’s a moot point.

UX Design And Development Come Together

The designer and the developer are a powerful combination. Be sure to brainstorm UI and UX together as often as possible. Developers should be willing to help a designer conceive ideas, while a designer should have at least a basic knowledge of the technology that is being used.
Figure out the design workflow together. Don’t just blindly implement what your designers create. Be proactive, and create something that looks beautiful and has a great user experience, by taking advantage of your two different perspectives. Designers think outside of the box and see crazy animations, ideas, pixels, and buttons, while developers see the technology, speed bumps, and limits.

In my experience, every designer is crazy about pixels and interesting concepts. But sometimes, a designer gets to a point where they have an idea, and the developer pushes back and says, “This isn’t going to work well once it’s implemented. There will be performance issues.” Recently, I was looking to implement a modal window with a blurred background, but the blur caused heavy loading times. To solve this problem, the developer suggested using a regular, full-color overlay, which loads faster and retains image quality. Designers, pay attention: Don’t compromise the user experience for the design.
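The performance gap the developer was pointing at comes down to arithmetic: a flat overlay is a single alpha blend per pixel, while a blur must sample a whole neighborhood of pixels (often in two passes) for every output pixel. A rough sketch of the overlay math, as an illustrative example rather than code from the project described above:

```python
def apply_overlay(base, overlay, alpha):
    """Alpha-blend a flat overlay color onto a base pixel: one pass, O(1) per pixel."""
    return tuple(int(o * alpha + b * (1 - alpha)) for b, o in zip(base, overlay))

# Dimming a white pixel with a 50%-opaque black overlay:
print(apply_overlay((255, 255, 255), (0, 0, 0), 0.5))  # (127, 127, 127)
```

A Gaussian blur with a radius of even a few pixels reads dozens of neighbors per output pixel, which is why blurred backgrounds tend to feel slow on lower-end devices while a flat overlay stays cheap.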

Feedback Loop

Feedback from the designer is crucial, and it has to happen as often as possible. It’s probably the most time- (and energy-) consuming thing that you will do. But you need to embrace it to be able to deliver perfect results. Here are a couple of UX and UI design workflow tips on how to make your feedback perfect.
·    Be visual - Feedback needs to be as specific as possible. The best way to make it accurate is to take a simple screenshot and highlight the problem you want to fix. It’s even better if you have pictures of the current implementation versus how it is supposed to look. Visual communication will eliminate 50% of the questions.

·     Be descriptive - Feedback should be accurate. You can’t just say “move this button up.” The designer must specify how many pixels the button should move, what padding should be used, etc. Always include an explanation of the problem, and the appropriate solution for it. It’s going to take a lot of time, but it’s worth it.
·     Be patient - Keep in mind that the designer and the developers do not share the same focus. If developers don’t fully understand a designer’s idea, it can lead to confusion and bad decisions. In every case, both sides need to be patient and willing to help the other team members. It’s really tough sometimes, but it is a soft skill that every designer and developer should learn.

It’s pretty obvious that these practices need to be combined into a suitable design workflow. But what tool can actually help you deliver the feedback?
·         Email - I’m not afraid to say that this is still the most common platform for delivering feedback. It is totally fine to use, if you follow a couple of simple rules.
o    First, use a single email thread for your feedback. Don’t put every individual tweak into a new email with a different subject.
o    Second, create a list of fixes. Sit down and think about every tweak or fix you noticed.
o    And lastly, don’t send a huge list at once. Break it down into smaller individual lists and go part by part.

·     Skype (Hangouts) - Voice is a really powerful tool for feedback. You can immediately ask and answer questions. But make sure to take notes and send over a follow-up message (email) after the call.

·     Collaboration tools - To be honest, I’m not a big fan of collaboration tools. But they have a big benefit: they help you keep feedback in one place. Asking and answering questions is fast, and it stays there forever.



Establish a system and UI/UX design workflow that keeps the communication lines open throughout the design and development process. This will allow you to implement great ideas, forecast potential problems, and prioritize important issues.
The developer and the designer can create great things together as long as they are willing to work as a team. Learn from each other, and from design tutorials like this one!

Tuesday, 7 June 2016

Trending areas for UX design

In this technology-driven world, we are moving very fast. Our lifestyles and behaviors are changing, which brings more opportunities to create new products, services, solutions, gadgets, and more. With the evolution of the Internet, companies have explored a larger area of business, and this area will expand even further with some amazing, near-future technologies. Innovation used to happen mostly on the technology side; now more of the imagination is happening on the design side. In this article, we look at some exciting areas where User Experience designers will be working in the near future.

Internet of things

The internet of things (IoT) is the network of physical objects—devices, vehicles, buildings and other items—embedded with electronics, software, sensors, and network connectivity that enables these objects to collect and exchange data.

Connected homes, cars, workstations, offices, and cities are going to transform our current environment and experience.

Virtual Reality

Virtual reality or virtual realities (VR), also known as immersive multimedia or computer-simulated reality, is a computer technology that replicates an environment, real or imagined, and simulates a user's physical presence and environment to allow for user interaction. Virtual realities artificially create sensory experience, which can include sight, touch, hearing, and smell.
Many companies have already begun working in this space and started creating VR products. Google Cardboard and Microsoft HoloLens are some well-known products in this area.

Augmented Reality

Augmented reality (AR) is a live direct or indirect view of a physical, real-world environment whose elements are augmented (or supplemented) by computer-generated sensory input such as sound, video, graphics or GPS data.


Wearable Technology

Wearable technology brings data and activity tracking from the user’s environment and converts it into valuable information, actions, recommendations, etc. Fitness- and utility-related wearable products appear on the market frequently. Examples: Apple Watch, Fitbit bands, etc.


Near Field Communication

Near field communication (NFC) is a set of communication protocols that enable two electronic devices, one of which is usually a portable device such as a smartphone, to establish communication by bringing them within 4 cm (2 in) of each other.

NFC is used heavily in access provisioning, shopping, and retail, but it is not limited to these domains.

UX & Virtual Reality - Designing for interfaces without Screens

Virtual Reality

It’s an experience that’s been around since the mid-80s, but technology always seemed to hold it back. The advances in smartphones and related technologies have finally brought the incredible potential of VR within reach. Now, we’re in the midst of a virtual reality revolution. The concept was coined around 1955, and so many years later VR is back in a big way with Oculus, Samsung Gear VR, Project Morpheus, Google Cardboard, HTC Vive, OSVR, and other smaller or yet-to-be-announced players. The well-known tech giants Facebook, Google, and Microsoft are investing keenly in VR, which strongly suggests it is going to be a game changer of this century.

What are they trying to do with VR?

It’s really just immersive software. You know how your phone is a tiny screen that you sometimes ignore? Virtual reality is pretty much the opposite. It uses a headset (a big pair of glasses) that fills your entire field of view with an image. You turn your head left, you see left. Turn your head right, you see right. You are placed inside a virtual world full of virtual things you can interact with, play with, design, and experience.

The VR Process

Designing for a flat 2D screen and designing for 3D virtual space each have their own challenges. Achieving the best user experience on VR devices is key to the success of the entire concept. Because VR combines various inputs, such as head movement tracking, eye tracking, gesture capture, mind mapping, etc., making them all sync together and binding them seamlessly with the design and visuals of your application takes a lot of effort and thought.

Who can utilize VRs?

Everyone. Yes, VR headsets come in three categories, affordable for every kind of user around the world. Every application you use on your phone or computer can be designed for virtual reality. There is a big misconception that VR is only for game development, which is totally wrong. Interior designers, doctors, industrial designers, e-commerce, banking, and every other line of business can use virtual reality for their work.

1. The low-end, entry-level headset. It’s actually just a fancy smartphone case. You slip your phone into a pair of lenses that strap onto your head like a scuba mask, and there you go, you’re in the VR world! These can be built out of plastic or even, as Google demonstrated some years back with Google Cardboard, out of cardboard. Samsung has one such model on the market today for $200.

2. The mid-range headset. It’s totally self-contained, like an Oculus Rift or Sony’s Project Morpheus, with its own display and probably some headphones. Think of it as a really nice TV or computer monitor for your face. Maybe you plug it into a phone or a PC to play games or watch movies. Oculus, which was acquired by Facebook, is selling its latest dev kit.

3. The augmented reality headset. It goes one step beyond virtual reality, blending real-world visuals with virtual objects. Imagine walking down the road and seeing Google Maps visuals, pins, and navigation overlaid on your path. Two big companies, Microsoft with its HoloLens and Magic Leap with its headset, are trying to accomplish this concept.

UX Principles for designing Virtual Reality

1. Everything Should Be Reactive
Every interactive object should respond to any casual movement. For example, if something is a button, any casual touch should provoke movement, even if that movement does not result in the button being fully pushed. When this happens, the haptic response of the object matches the user’s mental model, letting people use their muscles naturally to interact with objects. When designing a button: use a shadow from the hand to indicate where the user’s hand is in relation to the button; create a glow from the button, reflected on the hand, to help convey that relationship; and use sound to indicate when the button has been pressed (“click”).

2. Restrict Motions to Interaction
The display should respond to the user’s movements at all times, without exception. Even in menus, when the game is paused, or during cut scenes, users should be able to look around. Avoiding simulator sickness and lag is a key part of improving the UX of virtual reality applications. Do not initiate any movement without user input. Reduce neck strain with experiences that reward a significant degree of looking around. Try to restrict movement in the periphery.

3. Text and Image Legibility
Bigger, brighter, bolder text should be used for widgets. Images should be realistic and appealing to the user. The user’s mind is going to be entirely mapped into the virtual reality for a prolonged amount of time, so text should be readable and legible for unstrained viewing. The brighter and more vivid the colors, the more engaged the users will be.

4. Ergonomics
Designing around how the human body works is essential to bringing any new interface to life. Our bodies tend to move in arcs rather than straight lines, so it’s important to compensate by allowing for arcs in 3D space.

5. Sound Effects
Sound is an essential aspect of truly immersive VR. Combined with hand tracking and visual feedback, it can be used to create the “illusion” of tactile sensation. It can also be very effective in communicating the success or failure of interactions.

Google’s Design Guidelines for Virtual Reality

Google has listed some key principles, covering physiological and ergonomic considerations, to keep in mind while designing apps that run on Google Cardboard. They are pretty straightforward for designers to understand.

1. Using a Reticle
2. UI Depth & Eye Strain
3. Using Constant Velocity
4. Keeping the User Grounded
5. Maintaining Head Tracking
6. Guiding with Light
7. Leveraging Scale
8. Spatial Audio
9. Gaze Cues
10. Make it Beautiful

See also: Google’s Cardboard Guidelines and the best practices for designing for the Oculus Rift.

About Author 
With three years of professional experience in design and technology, I have a great passion for UX design, usability testing, and user research. With formal knowledge of the design process, I prototype interactive and intuitive designs for desktops, mobiles, and wearable technologies.

Monday, 6 June 2016

The Psychology of Wearables and Wearable Technology

In recent years we’ve seen new, disruptive innovations in the world of wearable technology; advances that will potentially transform life, business, and the global economy. Products like Google Glass, Apple Watch, and Oculus Rift promise not only to change the way we approach information, but also our long established patterns of social interaction.
Indeed, we are witnessing the advent of an entirely new genre of interface mechanisms, one that brings with it a fundamental paradigm shift in how we view and interact with technology. Recognizing, understanding, and effectively leveraging today’s growing landscape of wearables is likely to be increasingly essential to the success of a wide array of businesses.
In this article, we discuss the ways in which effective interface design will need to adapt, in some ways dramatically, to address the new psychology of wearable technology.

Enter the Neuroscientific Approach

Cognitive neuroscience is a branch of both psychology and neuroscience, overlapping with disciplines such as physiological psychology, cognitive psychology, and neuropsychology. Cognitive neuroscience relies upon theories in cognitive science coupled with evidence from neuropsychology and computational modeling.
In the context of interface design, a neuroscientific approach is one which takes into account – or more precisely, is centered upon – the way in which users process information.

The way people interact with new, never-before-seen technologies is bound more to their cognitive processes than to your designer’s ability to create a stunning UI. New, often unpredictable, patterns emerge any time a person is presented with a tool, a piece of software, or an action they have never seen before.
Accordingly, rather than employing more traditional approaches (such as wireframing and so on), you will instead focus on the sole goal of your product, the end result you want the user to achieve. You will then work your way back from there, creating a journey for the user by evaluating how best to align the user’s intuitive perception of your product with his or her interaction with the technology used. By creating mental images, you won’t need to design every step the user has to take to accomplish an action, nor will you have to evaluate every possible feature you could or couldn’t include in the product.
Consider, for example, Google Glass Mini Games. In these five simple games, made by Google to inspire designers and developers, you can see exactly how mental images play a major role in user engagement with the product. In particular, the anticipation of a future action comes to the user with no learning curve needed. When the active elements of the game pop into view, the user already knows how to react to them, and thus forms an active representation of the playing environment without needing to actually see one. Not only has the learning curve been reduced to a minimum, but the mental images put the user in charge of the action immediately, anticipating what the user will do and simply letting the user do it.
Bear in mind that it is possible to identify three different types of images that form in the brain at the time of a user interaction, all of which need to be adequately considered and addressed to achieve an effective, and sufficiently intuitive, interface. These include:
  1. Mental images that represent the present
  2. Mental images that represent the past
  3. Mental images related to a projected potential future

And don’t worry. You don’t need to run a full MRI on your users to test what is going on in their brain to arrive at these mental images. Rather, you can simply test the effectiveness and universality of the mental images you’ve built.

Users Do What Users Do

When approaching a new technology, it’s vital to understand how users experience and relate to that technology. In particular, a reality check is often needed to recognize how users actually use the technology in spite of how they’re “supposed to” (or expected to) use it. Too many times we’ve seen great products fail because businesses were expecting the users to interact with them in a way that in reality never occurred. You shouldn’t jump on the latest, fanciest technology out there and build (or, worse, re-shape!) your product for that technology without knowing whether it will actually be helpful to, and adopted by, your users. This is an easy mistake to make, and it’s quite eye-opening to see the frequency with which it occurs.

Leveraging Multiple Senses in Wearables

Wearables have the great advantage of being far more connected to the user’s physical body than any smartphone or mobile device could ever hope to be. You should recognize this from the early stages of your product development and stop focusing on hand interaction alone. Take the eyes, for example. Studies conducted with wearable devices in a hands-free environment have shown that the paths users follow, when their eyes are in charge, differ from the ones you would expect. People tend to organize and move in ways driven by their instinctive behavior rather than their logical one. They move instinctively towards the easier, faster paths to accomplish an action, and those paths are never straight lines.
One application that effectively leverages multiple senses is the Evernote app for the Apple Watch. Actions in the Watch version of the application have the same goals as their desktop/mobile counterparts, but are presented and accomplished in totally different ways. With a single, simple button click, you can automatically access all of the features of the app: you don’t need multiple menus and differentiation. If you start talking, the application immediately creates a new note with what you’re dictating, and syncs it with your calendar. As a user, you are immersed in an intuitive experience that lets you be in charge of what you’re doing, while presenting you with an almost GUI-free environment.
And what about our more subtle, cognitive senses? Wearables bring the human part of the equation more fully into account with a deeper emotional connection: stress, fear and happiness are all amplified in this environment. You should understand how your product affects those sensations and how to avoid or take advantage of those effects.
Just remember: let the cognitive processes of the users lead and not the other way around.

Voice User Interface (VUI)

In the past, designing a Voice User Interface (VUI) was particularly difficult. In addition to all the historical challenges of voice recognition software, VUIs also present a challenge due to their transient and invisible nature. Unlike with visual interfaces, once verbal commands and actions have been communicated to the user, they are no longer there. One approach that’s been employed with moderate success is to give visual output in response to vocal input. But designing the user experience for these devices still presents the same limitations and challenges as in the past, so we’ll give a brief overview here of what people like and don’t like about VUI systems, along with some helpful design patterns.
For starters, people generally don’t like to speak to machines. This might be a general assumption, but it becomes even more true if we consider what speaking is all about. We interact with someone on the assumption that the person can understand what we’re saying, or at least has the “tools” and “abilities” to do so. But even that is not generally sufficient. Rather, speaking with someone typically involves a feedback loop: you send out a message (carefully using words, sounds, and tones to help ensure that what you say is understood in the way you intended). The other person then receives the message and hopefully provides you with some form of feedback that confirms proper understanding.
With machines, though, you don’t have any of this. You will try to give a command or ask for something (typically in the most metallic voice you can muster!) and hope for the machine to understand what you’re saying and give you valuable information in return.
Moreover, speech as a means of presenting information to the user is typically highly inefficient. The time it takes to verbally present a menu of choices is very high. Users also cannot see the structure of the data and need to remember the path to their goal.
The bottom line here is that these challenges are real and there are not yet any “silver bullet” solutions that have been put forth. In most cases, what has been proven to be most effective is to incorporate support for voice interaction, but to limit its use to those places where it is most effective, otherwise augmenting it with interface mechanisms that employ the other senses. Accepting verbal input, and providing verbal feedback, are the two most effective ways to incorporate a VUI into the overall user experience.


While designing for wearable tech, remember that you will find yourself in a different, unusual habitat of spaces and interactions that you’ve probably never confronted before (and neither have most of your users).
Grids and interaction paths, for example, are awesome for websites and any other setting that requires a huge amount of content to be handled. With wearable devices, though, you have limited space for interaction and should rely on the instinctive basis of the actions you want to implement to give the best experience to the users.
Let’s take the Apple Watch, for example. For one thing, you will only be able to support one or two concurrent interactions. Also, you don’t want the user to constantly switch between tapping on the screen and scrolling/zooming with the digital crown on the side of the device. Frankly, even Apple itself made mistakes in this regard. In the Watch’s menu interface, for example, to safely tap an icon you will need to zoom in and out most of the time using the crown, briefly distracting yourself from the task you wanted to accomplish.
A great example of effective micro-interaction design can be found in the Starwood Hotels and Resorts app for the Apple Watch. The Starwood application perfectly fits their brand experience by letting users unlock their hotel room door with the simple tap of a button. You don’t need to see the whole process to appreciate what this kind of micro-interaction can do: you’re in the hotel, and you want to open your door without digging through your bag or pocket for the actual key. The app also demonstrates one of the best practices for wearables: being selective. Don’t include more actions or information than you should; otherwise it will disrupt the experience (as in the Apple menu example). Rather, just show the user the check-in date and the room number. When they tap “unlock,” a physical reaction occurs, outside the device and in the real world.

The KISS Principle

The well-known KISS principle is perhaps even more relevant in the domain of wearables than it is with more traditional user interface mechanisms. The Wishbi ShowRoom app for Google Glass is a great working example of a light UI that enriches the user experience without “getting in the way.” It has already been adopted by companies like Vodafone and Fiat. Basically, it facilitates streaming content live online, even at different quality rates. And it does that with two simple actions: one that starts the broadcast, and a split screen that captures everything you’re seeing. For an action as complicated as live broadcasting, the app manages to be extremely lightweight and unobtrusive.

Wearable Technology Conclusions

Remember, every interface should be designed to empower and educate the user to perform a desired activity more quickly and more easily. This is true for every interface platform, not just wearables.
But that said, please DO go crazy! Wearable technology is a revolutionary field, and even though you can look to principles and patterns to play it safe, you should always leave some room for crazy, playful ideas that don’t seem to make sense yet. You can make your own answers here.
Wearable tech interfaces represent a wide open playing field. Have at it!
This article was originally published on Toptal


About Author:

Antonio Autiero is a Software Engineer at Toptal. Antonio is a digital art director and UX designer with experience in information architecture, brand development, and business design. He has been working for over 10 years all around the world with amazing clients such as Nike, Rolex, Ferrari, and more. He lives in Gubbio, Italy where he's surrounded by beautiful countryside, and he always likes to meet nice people.