Public:Personal Shopping Assistant

Market Sizing

Who, What, Why

Who: Shoppers who shop online

What:

  • Like the W3 Store Assistant, except that we are interested in our shopper rather than the brand, so we make recommendations to shoppers across stores so that they get the best deal
  • Our chatbot overlays on top of ecommerce stores and assists shoppers by offering a Conversational User Experience on top of each store's in-store Graphical User Experience

Why: ?

Narrowing Focus

Personal Shopping Assistant/for Food and Grocery/in India/Vernacular languages

Personal Shopping Assistant/for Apparel/in India/Vernacular languages

Personal Shopping Assistant/for Sustainability

Personal Shopping Assistant/for Vitamins and Supplements

Competitors

https://greylock.com/portfolio/#curated

yuka.io

openfoodfacts.org

Capabilities

Scan Products

with Barcode Camera

with Ingredients Camera

Barcode scanning might be inefficient when

  • the barcode is not stored in the database
  • the same barcode is reused across multiple products by some producers

So you scan the ingredients instead and find out whether you should buy the product or not.
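
A minimal sketch of that fallback, assuming a hypothetical in-memory facts store keyed by barcode and an OCR step that has already produced the raw label text (FACTS_BY_BARCODE, parse_ingredient_text and identify_product are illustrative names, not an existing API):

  # Sketch: prefer a barcode lookup, fall back to the scanned ingredient list.
  # FACTS_BY_BARCODE and parse_ingredient_text are illustrative placeholders.

  FACTS_BY_BARCODE = {
      "8901058000290": {"name": "Instant Noodles", "ingredients": ["wheat flour", "palm oil", "salt"]},
  }

  def parse_ingredient_text(ocr_text: str) -> list[str]:
      """Split OCR'd label text like 'Wheat flour, Palm oil, Salt' into ingredient names."""
      return [part.strip().lower() for part in ocr_text.split(",") if part.strip()]

  def identify_product(barcode: str | None, ocr_text: str | None) -> dict:
      """Return whatever facts we can recover for the scanned product."""
      if barcode and barcode in FACTS_BY_BARCODE:
          return FACTS_BY_BARCODE[barcode]  # fast path: the barcode is known
      if ocr_text:
          # fallback: the ingredients camera at least gives us the ingredient list
          return {"name": None, "ingredients": parse_ingredient_text(ocr_text)}
      return {"name": None, "ingredients": []}  # nothing recognisable

  print(identify_product("0000000000000", "Wheat flour, Palm oil, Salt"))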

with Browser Extension

with Phone Camera

So that unpackaged foods can also be scanned.[1]

Lens

Consumers see the world through their own perspective[2] - through their lens. You should be able to put on the lens of the WHO, or the lens of Gordon Ramsay. Based on which lens you apply, you will see different recommendations for products.
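
One way to think of a lens is as a named set of weights over ingredients. The sketch below is illustrative only; the lens names and numbers are invented and do not reflect real WHO or chef guidance:

  # Sketch: a "lens" as a named mapping from ingredient to a weight.
  # All lens names and numbers are invented for illustration.

  LENSES = {
      "who_style_lens": {"sugar": -30, "palm oil": -10, "salt": -10},
      "chef_style_lens": {"msg": -5, "salt": 0, "butter": 5},
  }

  def apply_lens(lens_name: str, ingredients: list[str]) -> int:
      """Sum the lens weights for the ingredients the lens has an opinion on."""
      lens = LENSES[lens_name]
      return sum(lens.get(ingredient, 0) for ingredient in ingredients)

  print(apply_lens("who_style_lens", ["sugar", "salt", "water"]))  # -40

Switching the lens changes the weights without changing any scanning or grading code, which is what lets the same scan produce different recommendations.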

Facts Database

about Companies

about Ingredients

about Product Categories

Grader

The user "scans" a product and puts on a "lens", which makes our app use the "facts" database of that lens. Our UI then shows a grading of the product - a score from 0 to 100.
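
A minimal grading sketch under the same assumption of illustrative facts and lens data; the base score of 100, the per-ingredient penalties and the clamp to 0-100 are made-up choices, not the actual grading rules:

  # Sketch: grade a scanned product from 0 to 100 using the facts of the chosen lens.
  # The base score, the per-ingredient penalties and the lens contents are illustrative.

  LENS_FACTS = {
      "who_style_lens": {"sugar": 30, "palm oil": 15, "salt": 10},  # penalty per flagged ingredient
  }

  def grade(ingredients: list[str], lens_name: str) -> int:
      """Start from 100, subtract the lens penalties, clamp to the 0-100 range."""
      penalties = LENS_FACTS[lens_name]
      score = 100 - sum(penalties.get(ingredient, 0) for ingredient in ingredients)
      return max(0, min(100, score))

  print(grade(["sugar", "palm oil", "water"], "who_style_lens"))  # 55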

Product Recommendation

For every scan, our app recommends products. These recommendations can be:

more sustainable products

people also buy
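
A sketch covering both kinds of recommendation, assuming a tiny illustrative catalogue with precomputed grades and co-purchase counts (CATALOGUE and ALSO_BOUGHT are placeholders, not a real data source):

  # Sketch: "more sustainable products" = same category, higher grade;
  # "people also buy" = most frequent co-purchases. All data is illustrative.

  CATALOGUE = {
      "noodles_a": {"category": "noodles", "grade": 40},
      "noodles_b": {"category": "noodles", "grade": 72},
      "soup_mix": {"category": "soup", "grade": 65},
  }
  ALSO_BOUGHT = {"noodles_a": {"soup_mix": 12, "noodles_b": 3}}

  def more_sustainable(product_id: str) -> list[str]:
      """Products in the same category with a strictly higher grade, best first."""
      category = CATALOGUE[product_id]["category"]
      grade = CATALOGUE[product_id]["grade"]
      better = [p for p, facts in CATALOGUE.items()
                if p != product_id and facts["category"] == category and facts["grade"] > grade]
      return sorted(better, key=lambda p: CATALOGUE[p]["grade"], reverse=True)

  def people_also_buy(product_id: str, top_n: int = 3) -> list[str]:
      """Most frequently co-purchased products for this scan."""
      counts = ALSO_BOUGHT.get(product_id, {})
      return sorted(counts, key=counts.get, reverse=True)[:top_n]

  print(more_sustainable("noodles_a"))  # ['noodles_b']
  print(people_also_buy("noodles_a"))  # ['soup_mix', 'noodles_b']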

Facts Training Interface

We would let users use a Wikidata-like interface to create

Item for Ingredient/Company/Product-Category

Weekly Releases

Week A

#Facts Database is loaded with the ingredients we built at the time of https://talkingdb.io/natural-language-ingredient . For every Ingredient we have a concept

#Facts Training Interface

  • lets users CRUD the Synonyms of an ingredient, using the Alias feature in Wikidata
  • lets users CRUD the Nutrition Score of an ingredient, using a Property on the Concept in Wikidata (see the sketch below)
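
A sketch of how an ingredient concept could carry synonyms and a nutrition score, loosely modelled on Wikidata's item/alias/statement shape; the class, field names, IDs and numbers are invented for illustration, and this is not the Wikidata API:

  # Sketch: an ingredient "concept" with Wikidata-style aliases and a nutrition score.
  # The class, field names, IDs and numbers are illustrative only.

  from dataclasses import dataclass, field

  @dataclass
  class IngredientConcept:
      concept_id: str            # local item id, in the spirit of a Wikidata Q-id
      label: str
      aliases: list[str] = field(default_factory=list)  # CRUD target for Synonyms
      nutrition_score: int | None = None                 # CRUD target for the score property

      def add_alias(self, alias: str) -> None:
          if alias not in self.aliases:
              self.aliases.append(alias)

      def set_nutrition_score(self, score: int) -> None:
          self.nutrition_score = max(0, min(100, score))

  msg = IngredientConcept(concept_id="ITEM-001", label="monosodium glutamate")
  msg.add_alias("MSG")
  msg.add_alias("E621")
  msg.set_nutrition_score(35)
  print(msg)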

#Scan Products #with Ingredients Camera

#Grader is used to show a score between 0 and 100 for the scanned ingredients

Week B - WIP

#Facts Training Interface

  • lets users CRUD the Lens of the Nutrition Score, using a Reference pointing to the #Lens

Week C - Public release

References

  [1] OpenFoodFacts.org - What about food without barcodes? https://support.openfoodfacts.org/help/fr-fr/12-donnees-api/62-qu-en-est-il-des-aliments-sans-code-barres
  [2] Yuka App referred to as a "fear mongering app" by a skin specialist: https://www.youtube.com/shorts/mX2HD3YjrFs