MyPalette

MyPalette is an application that helps users find the right color combinations for their clothes based on personal color preferences and current fashion trends. It is designed to simplify choosing clothing colors, a daily challenge for many people who want to look fashionable but feel unsure which combinations work together.

Year

2024

Category

iOS Application

Live Project

Visit Site

Research

The development of MyPalette was inspired by user research focused on the challenges many individuals face when selecting clothing colors that suit their personal preferences and current fashion trends. Surveys and user interviews revealed that people often struggle with coordinating outfits, feeling overwhelmed by the variety of colors available. Additionally, users expressed a desire for personalized fashion recommendations, particularly ones that could help them stay up-to-date with current trends. Through this research, it became clear that augmented reality (AR) could offer a unique solution by allowing users to visualize color combinations on their clothes in real time, making the process of selecting fashionable outfits more intuitive and enjoyable. The goal of MyPalette is to provide a simple yet effective tool to streamline the decision-making process, enhance personal style, and help users feel confident in their wardrobe choices.

Development

MyPalette was developed to offer an innovative, personalized approach to choosing clothing colors by integrating augmented reality (AR) and gesture recognition. The app uses ARKit and SceneKit to create a seamless experience in which users can view clothing color combinations in real time, superimposed on their own clothes through the phone’s camera. The Vision framework detects and tracks clothing in the real world, allowing the app to suggest color combinations based on the user’s preferences and current fashion trends. SwiftUI was used to build the app’s user interface, keeping it intuitive and responsive across devices. The development process focused on optimizing performance and delivering an immersive, interactive experience: gesture recognition lets users swipe, pinch, and otherwise interact with the virtual clothing combinations. The app also integrates third-party frameworks to extend its functionality, such as access to fashion-trend data and an expanded set of color palette recommendations.
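The swipe-and-pinch interaction described above could be sketched in SwiftUI roughly as follows. This is an illustrative assumption, not the app's actual code: the view name, palette contents, and gesture thresholds are all invented for the example.

```swift
import SwiftUI

// Hypothetical sketch of the gesture interaction: a color-preview overlay
// where swiping cycles through suggested colors and pinching resizes the
// preview. Palette values are placeholders; the real app would derive
// them from user preferences and trend data.
struct ColorPreviewOverlay: View {
    @State private var paletteIndex = 0
    @State private var scale: CGFloat = 1.0

    // Placeholder palette for illustration only.
    private let palette: [Color] = [.blue, .orange, .green, .purple]

    var body: some View {
        RoundedRectangle(cornerRadius: 12)
            .fill(palette[paletteIndex])
            .frame(width: 120 * scale, height: 120 * scale)
            .gesture(
                DragGesture(minimumDistance: 20).onEnded { value in
                    // Swipe left/right cycles through the color suggestions.
                    let step = value.translation.width > 0 ? 1 : -1
                    paletteIndex = (paletteIndex + step + palette.count) % palette.count
                }
            )
            .simultaneousGesture(
                MagnificationGesture().onEnded { factor in
                    // Pinch resizes the preview swatch, clamped to a sane range.
                    scale = min(max(scale * factor, 0.5), 2.0)
                }
            )
    }
}
```

In a design like this, the AR scene would render the chosen color onto the tracked garment, while the SwiftUI layer handles the touch gestures on top.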


Tech Stack

MyPalette combines several iOS technologies to create an interactive, AR-powered clothing recommendation tool. The app is built with ARKit for the augmented reality experience, letting users view and interact with clothing color combinations in real time. SceneKit handles 3D rendering, so virtual colors are seamlessly superimposed onto real-world garments and give users an accurate preview of how different colors would look on them. The Vision framework detects and tracks the user’s clothing, enabling the app to recommend color combinations based on what it sees. SwiftUI powers the user interface, offering a clean, intuitive design that works smoothly across devices. Gesture recognition, built on iOS’s standard touch handling, lets users interact with the app’s virtual elements through simple gestures. Third-party framework integrations add functionality such as trend analysis and extra color suggestions, helping users stay on top of the latest fashion styles.
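One plausible way the clothing-detection step could work is with Vision's person-segmentation request (available on iOS 15 and later), which isolates the wearer in a camera frame before recoloring. This is a hedged sketch under that assumption, not the app's actual pipeline.

```swift
import Vision
import CoreVideo

// Illustrative sketch: use Vision's person segmentation to produce a mask
// of the wearer from a camera frame. A later pass could restrict the mask
// to garment regions before compositing a suggested color on top.
func personMask(for pixelBuffer: CVPixelBuffer) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced                       // trade accuracy for frame rate
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try handler.perform([request])

    // The result is a single-channel mask marking person pixels.
    return request.results?.first?.pixelBuffer
}
```

Running this per frame of the ARKit camera feed would give the AR layer a mask to recolor, which matches the real-time preview behavior described above.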


Thank you


musawahaidar123@gmail.com
