In an age of overconsumption and information overload, users are rarely taught how to manage all this information. Bringing payments into the picture only complicates things, because decisions now carry a tangible cost.
How might we simplify the management of multiple subscription services?
The research process began with understanding the subscription economy through secondary research. Topics covered payment structures, content delivery, models of variable payment structures, and an environmental scan of the subscription economy and available services.
Primary research followed, conducted through interviews, user shadowing, and auto-ethnography. These were the insights drawn from the research:
Two main challenges were explored:
The ChatGPT challenge was explored through two references: cultural and procedural. The approach was to use GPT's ability to write code, particularly Python.
The Python script is then imported into Blender to generate a scene and animations from the code. The prompts were based on the following:
Sierpiński Triangle is a fractal created by repeatedly subdividing an equilateral triangle and removing the central one, yielding a nested, self-similar pattern. This visual recursion illustrates how a simple rule can produce increasingly intricate designs at each level of iteration.
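The subdivision rule described above can be sketched outside Blender in a few lines of plain Python. This is an illustrative sketch of the fractal rule, not the script used in the project:

```python
def subdivide(tri, depth):
    """Recursively replace a triangle with its three corner sub-triangles,
    discarding the central one -- the Sierpinski rule."""
    if depth == 0:
        return [tri]
    a, b, c = tri
    ab = ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)  # midpoint of edge a-b
    bc = ((b[0] + c[0]) / 2, (b[1] + c[1]) / 2)  # midpoint of edge b-c
    ca = ((c[0] + a[0]) / 2, (c[1] + a[1]) / 2)  # midpoint of edge c-a
    return (subdivide((a, ab, ca), depth - 1)
            + subdivide((ab, b, bc), depth - 1)
            + subdivide((ca, bc, c), depth - 1))

# Each iteration triples the triangle count: 3**4 = 81 at depth 4.
tris = subdivide(((0.0, 0.0), (1.0, 0.0), (0.5, 0.866)), 4)
print(len(tris))  # 81
```

One simple rule, applied recursively, produces the nested self-similar pattern the paragraph describes.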
Perlin noise is a procedural function that generates smoothly varying random patterns by interpolating local gradients. Its natural-looking randomness makes it ideal for creating realistic textures and landscapes in computer graphics.
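The core idea, random gradients fixed at lattice points and smoothly interpolated between, can be sketched in one dimension. This is a simplified illustration of the principle, not Perlin's full multi-dimensional algorithm:

```python
import math
import random

random.seed(42)
gradients = [random.uniform(-1.0, 1.0) for _ in range(256)]  # one gradient per lattice point

def fade(t):
    # Perlin's fade curve 6t^5 - 15t^4 + 10t^3: zero slope at t=0 and t=1,
    # which is what makes the result smooth across lattice boundaries
    return t * t * t * (t * (t * 6 - 15) + 10)

def noise1d(x):
    """1-D gradient noise: blend the contributions of the two
    surrounding lattice points' gradients."""
    i0 = math.floor(x)
    t = x - i0
    v0 = gradients[i0 % 256] * t              # contribution of left lattice point
    v1 = gradients[(i0 + 1) % 256] * (t - 1)  # contribution of right lattice point
    return v0 + (v1 - v0) * fade(t)

# The noise is exactly zero at integer lattice points and varies smoothly in between.
print(noise1d(3.0))  # 0.0
```

Summing several octaves of this at different frequencies gives the natural-looking textures and terrain the paragraph mentions.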
The second challenge was based on the discovery of a tool, FlowFrames.
The intent was to see the capabilities, performance and advantages of this tool for independent animators to optimize their workflow.
Prompt: Create a script for Blender: 1 row of 10 tubelights spaced out equally, all lying parallel to the Y axis, inspired by artist Dan Flavin. Make the tubelights move in a wave pattern. Give the tubelights an emissive material that cycles through all the colours sequentially through the tubelights
import bpy
import math

# Function to create a tube light with emissive material
def create_tube_light(name, location, color):
    bpy.ops.mesh.primitive_cylinder_add(vertices=32, radius=0.1, depth=2, end_fill_type='NGON', location=location, rotation=(math.pi/2, 0, 0))
    tube = bpy.context.object
    tube.name = name

    # Add emission shader
    mat = bpy.data.materials.new(name="EmissionMaterial_{}".format(name))
    mat.use_nodes = True
    mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

    emission = mat.node_tree.nodes.new(type='ShaderNodeEmission')
    emission.inputs["Strength"].default_value = 5
    emission.inputs["Color"].default_value = (*color, 1.0)

    output = mat.node_tree.nodes.get('Material Output')
    mat.node_tree.links.new(output.inputs[0], emission.outputs[0])

    tube.data.materials.append(mat)
    return tube, emission

# Create tube lights
num_tube_lights = 10
spacing = 3
tube_lights = []

# RGB colors for the tube lights
colors = [
    (1, 0, 0), (1, 0.5, 0), (1, 1, 0), (0.5, 1, 0),
    (0, 1, 0), (0, 1, 0.5), (0, 1, 1), (0, 0.5, 1),
    (0, 0, 1), (0.5, 0, 1)
]

for i in range(num_tube_lights):
    x = i * spacing
    color = colors[i % len(colors)]

    tube_light, emission_node = create_tube_light("Tube_{}".format(i), (x, 0, 0), color)
    tube_lights.append((tube_light, emission_node))

# Animate tube lights in a wave pattern
for frame in range(0, 375):
    bpy.context.scene.frame_set(frame)

    for i, (tube_light, emission_node) in enumerate(tube_lights):
        x = i * spacing

        # Slow down the wave animation by 50%
        z = 2 * math.sin(x * 0.5 + frame * 0.025)
        tube_light.location.z = z
        tube_light.keyframe_insert(data_path="location", index=2, frame=frame)

        # Change color
        color_index = (i + (frame // 20)) % len(colors)
        emission_node.inputs["Color"].default_value = (*colors[color_index], 1.0)
        emission_node.id_data.keyframe_insert(data_path='nodes["Emission"].inputs[0].default_value', frame=frame)
Prompt: Update the script to make it a seamless loop for 15 seconds at 25 frames per second. Spread out colors more evenly amongst tube lights and make the transitions between colours smoother
The animation below shows v1 beside v2.
import bpy
import math

# Function to create a tube light with emissive material
def create_tube_light(name, location, color):
    bpy.ops.mesh.primitive_cylinder_add(vertices=32, radius=0.1, depth=2, end_fill_type='NGON', location=location, rotation=(math.pi/2, 0, 0))
    tube = bpy.context.object
    tube.name = name

    # Add emission shader
    mat = bpy.data.materials.new(name="EmissionMaterial_{}".format(name))
    mat.use_nodes = True
    mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

    emission = mat.node_tree.nodes.new(type='ShaderNodeEmission')
    emission.inputs["Strength"].default_value = 5
    emission.inputs["Color"].default_value = (*color, 1.0)

    output = mat.node_tree.nodes.get('Material Output')
    mat.node_tree.links.new(output.inputs[0], emission.outputs[0])

    tube.data.materials.append(mat)
    return tube, emission

# Create tube lights
num_tube_lights = 10
spacing = 3
tube_lights = []

# RGB colors for the tube lights
colors = [
    (1, 0, 0), (1, 0.5, 0), (1, 1, 0), (0.5, 1, 0),
    (0, 1, 0), (0, 1, 1), (0, 0.5, 1), (0, 0, 1)
]

# Animation settings
fps = 25
total_seconds = 15
total_frames = fps * total_seconds

# Wave settings
wave_length = total_frames  # A wave completes in 15 seconds
amplitude = 2

for i in range(num_tube_lights):
    x = i * spacing
    color = colors[i % len(colors)]

    tube_light, emission_node = create_tube_light("Tube_{}".format(i), (x, 0, 0), color)
    tube_lights.append((tube_light, emission_node))

# Animate tube lights in a wave pattern
for frame in range(total_frames + 1):
    bpy.context.scene.frame_set(frame)

    for i, (tube_light, emission_node) in enumerate(tube_lights):
        x = i * spacing

        # Move in a wave formation
        z = amplitude * math.sin(2 * math.pi * (frame / wave_length + x / (num_tube_lights * spacing)))
        tube_light.location.z = z
        tube_light.keyframe_insert(data_path="location", index=2, frame=frame)

        # Smoothly change color
        color_phase = 2 * math.pi * (i / num_tube_lights + frame / total_frames)
        r = math.sin(color_phase) * 0.5 + 0.5
        g = math.sin(color_phase + 2 * math.pi / 3) * 0.5 + 0.5
        b = math.sin(color_phase + 4 * math.pi / 3) * 0.5 + 0.5
        emission_node.inputs["Color"].default_value = (r, g, b, 1.0)
        emission_node.id_data.keyframe_insert(data_path='nodes["Emission"].inputs[0].default_value', frame=frame)
Along with the updated code, I added an Array Modifier on the X and Y axes at a factor of 5 to create a sheet of 25 lights per colour. I wanted to expand on the visual and create a sense of scale the way Flavin did in some of his work.
After a lot of trial and error, I was able to get the code below to do what I had intended, producing the video alongside. The prompt below is a good place to start if you want to recreate the animation.
Prompt: Consider yourself an expert Python developer for Blender. I want a single script that creates a 3D Sierpinski pyramid fractal with the following requirements:
Please give me the complete code that accomplishes all these points in Blender.
import bpy
import mathutils
import math

def create_pyramid(size, location):
    height = size  # Set height equal to the base length

    # Define vertices for an equilateral pyramid
    verts = [
        mathutils.Vector((-size/2, size*math.sqrt(3)/6, 0)),
        mathutils.Vector((size/2, size*math.sqrt(3)/6, 0)),
        mathutils.Vector((0, -size*math.sqrt(3)/3, 0)),
        mathutils.Vector((0, 0, height))
    ]

    edges = []
    faces = [(0, 1, 2), (0, 1, 3), (1, 2, 3), (2, 0, 3)]

    mesh_data = bpy.data.meshes.new("pyramid")
    mesh_data.from_pydata(verts, edges, faces)
    mesh_data.update()

    obj = bpy.data.objects.new("Pyramid", mesh_data)
    bpy.context.collection.objects.link(obj)
    obj.location = location

    return obj

def sierpinski_pyramid(level, size, location):
    if level == 0:
        return [create_pyramid(size, location)]

    half_size = size / 2
    height_offset = half_size
    offset = mathutils.Vector((0, 0, height_offset))

    pyramids = []
    for i in range(3):
        angle = i * math.radians(120)
        rotation_matrix = mathutils.Matrix.Rotation(angle, 4, 'Z')
        new_location = location + rotation_matrix @ mathutils.Vector((half_size, 0, 0))
        pyramids.extend(sierpinski_pyramid(level - 1, half_size, new_location))

    pyramids.extend(sierpinski_pyramid(level - 1, half_size, location + offset))
    return pyramids

# Clear existing mesh objects
bpy.ops.object.select_all(action='DESELECT')
bpy.ops.object.select_by_type(type='MESH')
bpy.ops.object.delete()

# Set the level of the fractal
level = 6  # Increase the fractal level to 6
size = 2.0
location = mathutils.Vector((0, 0, 0))

pyramids = sierpinski_pyramid(level, size, location)

# Set the start and end frame for the animation
start_frame = 10
end_frame = 300

# Animate each pyramid individually
for pyramid in pyramids:
    bpy.context.view_layer.objects.active = pyramid
    bpy.ops.object.origin_set(type='ORIGIN_CENTER_OF_MASS', center='BOUNDS')

    # Animate the whole pyramid
    original_location = pyramid.location.copy()
    pyramid.location.z = 0
    pyramid.keyframe_insert(data_path="location", index=2, frame=start_frame)
    pyramid.location = original_location
    pyramid.keyframe_insert(data_path="location", index=2, frame=end_frame)

    # Animate individual vertices
    mesh = pyramid.data
    for vertex in mesh.vertices:
        original_z = vertex.co.z
        vertex.co.z = 0
        vertex.keyframe_insert(data_path="co", index=2, frame=start_frame)
        vertex.co.z = original_z
        vertex.keyframe_insert(data_path="co", index=2, frame=end_frame)

bpy.context.scene.frame_end = end_frame
Again, after some troubleshooting I arrived at code that created the scenes below. To recreate this scene (or something similar) with GPT, I would recommend starting with the prompt below.
Prompt: Consider yourself an expert Python developer working with Blender (bpy). I need a single script that does the following:
Please provide the complete Blender Python script (using `bpy`, `bmesh`, `random`, `mathutils`, etc.) that fulfills these requirements. Make sure to keyframe both the emissive color changes and the Z-axis noise-based movement for all three circles of spheres.
import bpy
import colorsys
import math
import bmesh
import random
import mathutils

# set frames and fps
bpy.context.scene.render.fps = 30
bpy.context.scene.frame_end = 450

# "noise" scale
NOISE_SCALE = 0.3

# helper to keyframe a material's emission colour at a given frame
def animate_material(mat, frame, color):
    emission_node = mat.node_tree.nodes["Emission"]
    emission_node.inputs[0].default_value = (*color, 1.0)
    emission_node.inputs[0].keyframe_insert(data_path="default_value", frame=frame)

# function to create a circle with points and colors based on harmony
def create_colored_circle(name, points, radius, location, harmonies):
    # create mesh and object
    mesh = bpy.data.meshes.new(name)
    obj = bpy.data.objects.new(name, mesh)

    # link object to scene
    bpy.context.collection.objects.link(obj)

    # create bmesh
    bm = bmesh.new()

    spheres = []
    materials = []
    for i in range(points):
        # calculate position
        angle = 2.0 * math.pi * (i / points)
        pos = [radius * math.cos(angle), radius * math.sin(angle), 0]

        # create vertex
        v = bm.verts.new(pos)

        # create material
        mat = bpy.data.materials.new(name=name + str(i))
        mat.use_nodes = True
        mat.node_tree.nodes.remove(mat.node_tree.nodes.get('Principled BSDF'))

        emission_node = mat.node_tree.nodes.new(type='ShaderNodeEmission')
        emission_node.inputs[1].default_value = 1.0  # Strength

        output_node = mat.node_tree.nodes.get('Material Output')
        mat.node_tree.links.new(emission_node.outputs[0], output_node.inputs[0])

        materials.append(mat)

        # create sphere
        bpy.ops.mesh.primitive_uv_sphere_add(radius=0.05, location=pos)
        sphere = bpy.context.object
        sphere.data.materials.append(mat)
        spheres.append(sphere)

    # animate material colors
    for frame in range(bpy.context.scene.frame_end + 1):
        for i, mat in enumerate(materials):
            t = frame / bpy.context.scene.frame_end
            harmony = harmonies[i % len(harmonies)]
            hue = (t + harmony) % 1.0
            color = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
            animate_material(mat, frame, color)

    # animate sphere positions
    for frame in range(bpy.context.scene.frame_end + 1):
        for i, sphere in enumerate(spheres):
            t = frame / bpy.context.scene.frame_end
            # noise() expects a 3D vector argument
            sphere.location.z = mathutils.noise.noise(mathutils.Vector((t + i / points, 0.0, 0.0))) * NOISE_SCALE
            sphere.keyframe_insert(data_path="location", frame=frame)

    # update mesh
    bm.to_mesh(mesh)
    bm.free()

    # move circle to location
    obj.location = location

# create circles
create_colored_circle('circle1', 36, 1.0, [0, 0, 0], [i / 36 for i in range(36)])  # all colors
create_colored_circle('circle2', 20, 0.7, [0, 0, 1], [0, 0.25, 0.5, 0.75])  # double-split complementary
create_colored_circle('circle3', 12, 0.49, [0, 0, 2], [0, 0.33, 0.67])  # square

# refresh scene
bpy.context.view_layer.update()
I found an AI interpolation tool called FlowFrames. It uses AI to interpolate and upscale video frame rates: it fills in the gaps between frames, stitches them together, and renders a video at a higher framerate, with the option to slow the footage down for a slow-motion effect. Below is the before and after of the upscaling:
This tool can really help independent animators working in any 3D modeling or animation software. In this case, using Blender, I rendered the animation at 15 frames per second at 4096 samples, intentionally pushing the limits with reflections and lighting.
Next, to reduce the choppiness of the animation, I used the upscaler to bring it up to 60 fps. With 32 GB of RAM configured on my machine, the upscaling render took about 3 minutes.
Overall, using this process to upscale to frame rates of up to 120 FPS cut render time by 70-75% compared to what I would have expected otherwise.
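That savings figure checks out as a back-of-envelope calculation, if we assume render time scales roughly linearly with the number of frames Blender has to path-trace:

```python
fps_rendered = 15   # frame rate actually rendered in Blender
fps_target = 60     # frame rate after FlowFrames interpolation
seconds = 15        # clip length

frames_rendered = fps_rendered * seconds  # 225 frames
frames_final = fps_target * seconds       # 900 frames

# Only a quarter of the final frames are path-traced; the rest are interpolated.
saving = 1 - frames_rendered / frames_final
print(f"{saving:.0%}")  # 75%
```

The small gap between this idealized 75% and the observed 70-75% is plausibly the interpolation time itself.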
After all the experimenting, these were some of my takeaways using AI as a designer/creative:
Enhancing the Toronto Raptors App to deliver a more intuitive, engaging, and user-friendly experience by improving navigation, real-time game tracking, and fan interaction.
The Toronto Raptors mobile app lacked an intuitive and engaging user experience, making it difficult for fans to access real-time game updates, team statistics, and arena information efficiently. Key usability issues included:
My redesign was informed by user research and competitive analysis of leading sports apps to improve usability, accessibility, and fan engagement.
Interviews with Raptors fans revealed major usability issues:
To inform the redesign, I analyzed industry-leading sports apps (e.g., NBA Official, F1) to identify UX improvements for real-time sports tracking and fan engagement:
To solve the usability and engagement challenges identified in research, I followed a user-centered, iterative design process, focusing on navigation simplification, interaction improvements, and branding refinements.
I explored different layout structures to simplify navigation and ensure real-time updates were easily accessible:
To align the redesign with the Raptors' We The North identity, we incorporated the logo, colors, typography, and mood-board inspiration to ensure a cohesive and visually compelling experience:
The primary Raptors logo and emblem were integrated throughout the UI, reinforcing team identity and maintaining a strong visual connection to the brand.
The official Raptors color palette was applied strategically across the interface to ensure high contrast and strong visual impact.
RED — Pantone PMS 200 C · HEX #CE1141 · RGB (206, 17, 65) · CMYK (0, 100, 65, 15)
BLACK — Pantone PMS Black 6 C · HEX #061922 · RGB (6, 25, 34) · CMYK (30, 0, 0, 100)
SILVER — Pantone PMS Cool Gray 8 C · HEX #A1A1A4 · RGB (161, 161, 164) · CMYK (0, 1, 0, 43)
GOLD — Pantone PMS 872 C · HEX #B4975A · RGB (180, 151, 90) · CMYK (20, 30, 70, 15)
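The HEX and RGB values in the palette encode the same colors in two notations; a small helper (illustrative, not part of the brand guide) shows the conversion:

```python
def hex_to_rgb(hex_color):
    """Convert a '#RRGGBB' hex string to an (R, G, B) tuple of 0-255 ints."""
    h = hex_color.lstrip('#')
    return tuple(int(h[i:i + 2], 16) for i in range(0, 6, 2))

print(hex_to_rgb('#CE1141'))  # (206, 17, 65)  -- Raptors red
print(hex_to_rgb('#B4975A'))  # (180, 151, 90) -- Raptors gold
```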
Millionaire serves as the primary signature font for branding elements, reinforcing the bold and dynamic Raptors aesthetic.
Roboto was selected for UI readability, ensuring a modern, clean, and legible text experience for stats, menus, and real-time updates.
The secondary signature font is Chinese Rocks. This style was used to reinforce the bold, rebellious, and distinctive branding of the Toronto Raptors' slogan, "We The North".
The mood board guided the visual direction, drawing inspiration from urban Toronto culture, the energy of the city, and the electricity of live games.
The goal of the final UI was to reflect a sleek, high-energy aesthetic while maintaining usability and accessibility.
Below are some mockups of the widgets on the home screen:
As new breakthroughs are made in brain-computer interaction and neuroscience, the way we interface with cyberspaces is going to change completely. Interfaces to date have been designed around an aggregate of user behaviours: everyone thinks and communicates differently, yet we are all required to use the same aggregate-based interface archetype. As we better understand how our brains work, our devices should follow.
At present we interface with the cloud by way of "data mediums", i.e. our daily devices: we interact with a device, it communicates with the cloud, and the cloud comes back to us with a response. As devices become closer and more connected to us, the way we interact with the cloud will also change.
Instead of using devices to interact with the cloud, we may eventually be able to interface directly through our cognition. The resulting service will need to operate and provide responses to users proactively and with intrinsic value in order to keep up to speed.
The above diagram maps out all current services that provide users with a proactive service that utilises intrinsic data. The value opportunity here is a proactive digital assistant that provides intrinsically relevant responses to user queries.
Personality, temperament, moods, and thoughts are all factors that influence each other over time but at the core of it is the thought process. The diagram below is a representation of approach versus affect. Until technologies are capable enough of understanding human thought, one way to start unpacking such a complicated system is by starting from the outside in, by beginning with key personality traits.
There are many different techniques that are employed to determine personality traits. One such method is by using the Myers-Briggs Type Indicator (MBTI). It categorises personalities based on the following four spectrums:
Where does one get their energy?
- Introversion (I) vs Extroversion (E)
How do they take in information?
- Sensing (S) vs Intuition (N)
How do they base their decisions?
- Thinking (T) vs Feeling (F)
How do they organise their world?
- Judging or structure (J) vs Perceiving or flexibility (P)
Depending on the letters of your personality type, it can be further categorised under either Sentinels, Analysts, Explorers or Diplomats.
Sentinels are helpful and exceptionally pragmatic, embracing and creating order, security and stability wherever they go. Individuals in this category are generally persevering, meticulous and traditional. They succeed in logistical or administrative fields, particularly those that depend on clear processes and rules. These character types like to adhere to their plans and don't avoid taking on challenges – however, they can be inflexible and reluctant to accept different points of view.
Diplomats prioritize empathy and cooperation, and do well in areas such as diplomacy and counseling. They are imaginative and often take on the role of peacekeeper in their workplace or social circles. Diplomats are warm, empathetic and influential individuals, but can struggle to make rational decisions in tough situations.
Analysts are inclined to rationality and impartiality, excelling in fields of science, technology and intellectual debate. Other character traits include being independent, open-minded, strong-willed and imaginative, as well as approaching situations from a utilitarian perspective. Their priority often lies in what works rather than what satisfies everybody. These traits make Analysts excellent strategic thinkers.
Explorers are utilitarian, practical, and excel at thinking on their feet. They are masters of craft and techniques, using them from multiple points of view – from mastering these skills to pursuing new ones. These character types are indispensable in emergencies, craft, and sales. Their qualities can also push them towards risky challenges or an exclusive focus on sensory experiences.
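These four role groups are determined entirely by the letters of the type code (the grouping popularized by frameworks such as 16personalities): Analysts share N and T, Diplomats N and F, Sentinels S and J, Explorers S and P. A small sketch of that mapping:

```python
def mbti_role(type_code):
    """Map a four-letter MBTI type to its role group:
    Analysts = _N_T, Diplomats = _N_F, Sentinels = _S_J, Explorers = _S_P."""
    t = type_code.upper()
    if t[1] == 'N':
        return 'Analyst' if t[2] == 'T' else 'Diplomat'
    return 'Sentinel' if t[3] == 'J' else 'Explorer'

print(mbti_role('ISTJ'))  # Sentinel
print(mbti_role('ENFP'))  # Diplomat
print(mbti_role('INTP'))  # Analyst
print(mbti_role('ESFP'))  # Explorer
```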
First, it is an Infrastructure as a Service (IaaS): a virtual private cloud that allows secure private storage of data as well as running cloud-based applications, typically a service offered only to enterprise customers. Given the growth of cloud storage demand among users over the last few years, however, companies should prepare to provide enterprise-level cloud services for everyday users in the coming years.
And secondly, it is a Software as a Service (SaaS): an interface in the form of an advanced virtual assistant, modeled to respond and access information based on the user's contextual psychographic data as well as personality indicators – creating interactions synthesized through digital empathy for the user.
The following is a Value As Delivery model which maps out the value being delivered, revenue, costs, key partners and participants in the platform.
What is the potential market value of the service to a web company like Amazon?
As of 2021, Amazon reported over 146.8 million Prime members.
Multiplying the annual service cost by the current user base gives Amazon's potential market value for this service:
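The calculation itself is a single multiplication over the Prime member count stated above. The annual service cost below is a placeholder assumption for illustration, not a figure from the research:

```python
prime_members = 146_800_000   # Amazon Prime members as of 2021 (stated above)
annual_service_cost = 120     # hypothetical annual fee in USD -- an assumption

potential_market_value = prime_members * annual_service_cost
print(f"${potential_market_value:,}")  # $17,616,000,000 at the assumed fee
```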
To begin with, key trends were drawn out from an industry assessment with regards to the overarching theme of Augmented Life:
Companies still using traditional marketing methods are being left in the past as others invest in "direct to consumer" product experiences.
There are a variety of AR and VR technologies being developed to supplant computing, phone, and TV technologies.
Successful brands are now considering how designed experiences and brand perception can influence customer opinions.
As users call for wider accessibility and user empathy, interfaces have become smoother and more efficient.
Using Augmented Life as a starting point, associated opportunities were mapped out in a brainstorming session:
Following the brainstorming session, interviews were conducted to understand what makes an experience valuable and/or memorable. Mapping insights by positive or negative sentiment versus memorable or non-memorable moments gave us insight into where the opportunity lies.
As we realise the permanence of the digital age, the value of digital legacy is becoming more apparent. Addressing the building blocks of a legacy can also determine what defines a valuable moment.
Understanding Instagram's current vision model was key in developing the Augmented vision model.
The Memoir page gives the user a more personalized and in-depth understanding of the historical and emotional value connected to the assets. As a result, the assets become part of not only a physical realm but an emotional one as well. The connection between the memoirs and the assets creates a more valuable experience for the user, and gives them the motivation to continue and care for the legacy of the benefactor.
The Financial Doula is a service provided through a digital platform, giving the user the freedom to take the next steps at their own pace and time. Millennials and younger generations prefer quick access to information, so a digital application meets these needs. The Doula is also an educational resource, providing a financial glossary to aid the user in furthering their understanding of the process.
The Serene space provides baby boomers a physical platform to easily exchange their legacy. It has been noted that older generations like baby boomers prefer face-to-face interaction when it comes to their finances. Therefore, Serene brings two generations together in a peaceful and welcoming environment to openly converse about the beginning or continuation of a legacy.
Insights were synthesized from interviews by developing an experience map of a habitual smoker:
Based on interviews, there were some consistent pain points among users:
Users want to know the ins and outs of their payments because it helps them organize and budget their life.
Users clearly recognize a market that is getting saturated. It is becoming a cause for concern as these are costs that accumulate.
Over-subscribing has become a big issue in the subscription industry, and people are losing countless dollars on services they aren't using.
People want to be able to not only share subscriptions with family, but with friends as well through shared payments. No more “pay me back” or risk one friend footing the bill.
Key themes that were of value to specific demographics:
While most users browse freely, others are more decisive and like to maintain accountability.
Subscription packages are more approachable at student, family, and shared rates.
Users have trouble juggling multiple subscriptions and often contemplate if they are worth the cost.
Single subscriptions are being shared widely amongst friends and family members.
Insights were further mapped to understand what the user says, thinks, does, and feels.
The following were the service features that were developed from the research:
After prototyping, feedback provided some valuable critique:
Have an interesting project that you'd like to collaborate on? I'm always excited to work with others. Feel free to reach out for any feedback, questions, opportunities or even just to say hello.