
AWS Batch Job Definition Form

UX Case Study

Frame 4408.png
Frame 4410.png

Challenge: Improve the user flow and interface of the AWS Batch Job Definition Form 
Deliverables: Audit Documentation, Hi-Fidelity Mockups

Frame 4412.png

Background

AWS Batch is a cloud computing service that allows users to run and monitor thousands of batch computing jobs efficiently. Jobs are the individual tasks that users, such as engineers, developers, and scientists, submit to be run.

Job definitions specify the parameters a job will run with, such as how many vCPUs to use, how much memory to allocate, and which command to run. When creating a job, a user must first assign it to a job definition. Users create job definitions on the Job Definitions page of the console.
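For context, here is a minimal, hypothetical sketch of how a job definition might be registered and used programmatically with boto3 (the AWS SDK for Python). The names, container image, and values below are illustrative assumptions, not taken from this project.

import boto3

# Illustrative example: register a simple container job definition,
# then submit a job that runs against it.
batch = boto3.client("batch")

definition = batch.register_job_definition(
    jobDefinitionName="example-job-definition",    # hypothetical name
    type="container",
    containerProperties={
        "image": "public.ecr.aws/amazonlinux/amazonlinux:latest",  # illustrative image
        "command": ["echo", "hello world"],         # which command will run
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},         # how many vCPUs
            {"type": "MEMORY", "value": "2048"},    # how much memory (MiB)
        ],
    },
)

# A job must be assigned to a job definition before it can run.
batch.submit_job(
    jobName="example-job",                          # hypothetical name
    jobQueue="example-queue",                       # hypothetical queue
    jobDefinition=definition["jobDefinitionArn"],
)

The console form audited in this case study is essentially a visual front end for building the parameters shown above.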

Problem

It is important to note that the AWS Batch console was built entirely by developers, without any UX design input. As a result, the Job Definition form has a myriad of UX and UI problems that need to be addressed to improve the form's usability and efficiency.

DESIGN PROCESS

I started by creating a user flow of the common path a user takes when creating a job definition. This gave me a better understanding of the form as a whole and of the intricacies of each step.

Group 2.png

Auditing

With a better idea of each action a user takes to create a job definition, I delved into auditing each page of the console form.

Screenshot 2024-05-21 at 1.08.52 AM.png

It's important to note that the form changes based on which option a user selects in step 1 (Windows or Linux), so annotations had to be created for both options.

I highlighted the mistakes I found on each page/step. Throughout this audit, I noticed that the bulk of the issues concerned the interface itself, while the flow of the form had only minor problems. Nevertheless, both types of issues affected the overall user experience of the form.

Group 1.png

While making these annotations, I looped in the dedicated engineer on this project early on to ensure that my proposed design changes made sense from a technical standpoint. I also collaborated with our copywriter to improve the text throughout the form.

DESIGN

After completing all annotations and copywriting improvements, I created wireframes to reflect these new changes.

Screenshot 2024-05-21 at 2.22.36 AM.png

To keep everything clear and concise, I took a screenshot of each console page and numbered each annotated change. Once I built the improved wireframe for that page, I placed the corresponding number where the change was made. This way, the devs would know exactly where each improvement resides.

Group 1.png
Group 3 (1).png

I shared these final iterations with the dev team, made a few tweaks based on the feedback I received, and placed the designs in the dev backlog.

A CLOSER LOOK

I developed a total of 16 wireframe concepts, four for each step/page. As previously mentioned, the form changes based on which option a user selects in step 1 (Windows or Linux), so wireframes had to be created for both options.

Let's take a closer look at each of the audit errors I resolved in the Linux user flow.

Step 1

Group 1.png
Group 3 (1).png

Step 2

Group 4 (1).png
Group 6.png

Step 3

Group 7.png
Group 8.png

Step 4

Group 9 (1).png
Group 10 (1).png

What would I do differently?

Before starting the design process, I would have held user interviews to gain a better understanding of pain points from the users' perspective. This would have given me more direct insights to solve for, rather than relying on the audit alone.

Due to time constraints and project prioritization, we weren't able to hold these interviews.

Thanks for reading!
