zappy

joined 1 year ago
[–] [email protected] 0 points 11 months ago (1 children)

50% of people earn less than the median hourly wage
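
To see with made-up numbers why "median" and "average" diverge on skewed wage data, here's a quick Python illustration (the wages are invented):

```python
from statistics import mean, median

# Toy skewed "wage" sample (invented numbers): a few high earners
# pull the mean well above the median.
wages = [15, 16, 17, 18, 20, 22, 25, 30, 60, 150]
print(f"median: {median(wages)}")  # 21.0 -> exactly half earn less, by definition
print(f"mean:   {mean(wages)}")    # 37.3 -> 8 of 10 in this sample earn less
```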

[–] [email protected] 1 points 1 year ago

That's kind of the point, and how is that different from a human? A human will weight local/recent contextual information as much more relevant to the conversation because they're actively learning and storing it (our brains work on more of an associative-memory basis than a temporal one). With our current models, though, that's simulated by decaying weights over the data stream. So when contextually correct output conflicts with "globally" correct output, the global answer has a tendency to win out, and it's more obvious. Remember, as a user you can't actually make changes to the model without active learning. Thus the model will always eventually return to its original behaviour as long as you can fill up the memory.
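
To make "decaying weights over the data stream" concrete, here's a minimal Python sketch of exponential time decay; the function and the half-life parameter are invented for illustration, not how any particular model actually implements it:

```python
import math

def decayed_weights(num_turns: int, half_life: float = 10.0) -> list[float]:
    """Weight each conversation turn so that older turns contribute
    exponentially less; the newest turn gets weight 1.0."""
    return [
        math.exp(-math.log(2) * (num_turns - 1 - i) / half_life)
        for i in range(num_turns)
    ]

weights = decayed_weights(20)
print(f"oldest turn weight: {weights[0]:.3f}")   # ~0.268
print(f"newest turn weight: {weights[-1]:.3f}")  # 1.000
```

Fill the context with enough new turns and the oldest instruction's weight heads to zero, which is the "return to its original behaviour" above.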

[–] [email protected] 2 points 1 year ago (2 children)

I'm trying to tell you limited context is a feature, not a bug; other bots like Replika do the same thing. Even when all past data is stored server-side and available, it won't matter, because you have to reduce its weighting or you'd prevent any significant change in the output values (and allow less and less change as the history grows larger). Time decay of information is important to making these systems useful.
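
One way to see why storing all past data server-side doesn't help on its own: even with uniform weighting, each message's share of the total influence shrinks as the history grows. A toy illustration (the numbers mean nothing beyond the 1/N trend):

```python
def share_of_influence(history_len: int) -> float:
    # With uniform weighting, one old instruction holds 1/N of the
    # total weight, so its pull on the output vanishes as N grows.
    return 1.0 / history_len

for n in (10, 100, 10_000):
    print(f"{n:>6} stored messages -> one message's share: {share_of_influence(n):.4%}")
```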

[–] [email protected] 7 points 1 year ago (1 children)

Everyone rushing to over-supply the market is not a lack of demand. Legal weed is probably the largest new market of the last five years! Saying the company is underperforming because of a lack of demand, when people are buying from other companies, is such a hollow excuse.

[–] [email protected] 5 points 1 year ago (2 children)

Oh boy, you really need to read up on current events and history if you think Canada is some kind of feminist utopia.

[–] [email protected] 6 points 1 year ago (3 children)

The cannabis industry has been hampered by a lack of demand...

What? That I do not believe

[–] [email protected] 3 points 1 year ago

I hear this from Americans a lot; here, everything is pretty much online nowadays (although a friend of mine had her identity stolen, so she has to go in person, which is her biggest complaint about the whole thing).

[–] [email protected] 1 points 1 year ago* (last edited 1 year ago)

At that point, why not just embed the GPS tags in the ear tags we already put on cows? Or why can't we just spray-paint their butts like sheep? (Which I'm saying as a person who really knows nothing about this, but if it works for the sheep...)

[–] [email protected] 1 points 1 year ago (2 children)

Then the thieves are just going to cut off all the cows' ears!

[–] [email protected] 3 points 1 year ago (4 children)

The problem isn't memory capacity; even though the LLM can store the information, it's about prioritization/weighting. For example, if I tell ChatGPT not to include a word (say, "apple") in its responses, ask it some other questions, then ask it about popular fruit-based pies, it will tend to pick the "better" answer of including apple pie over the rule I gave it a while ago about not using the word. We do want decaying weights on memory, because most of the time old information isn't as relevant, but it's one of those things that needs optimization. Imo we're going to reach a point where the optimal parameters for maximizing "usefulness" to the average user are different enough from what's needed to pass someone intentionally testing the AI, mostly because we know from other assistants (like Siri) that people don't actually need that much saved context to find them helpful.
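
As a toy illustration of that prioritization conflict (every number here is invented, and this has nothing to do with ChatGPT's actual internals), a decayed instruction eventually loses to the stronger "global" answer:

```python
# Toy scoring: response score = global prior - rule penalty, where the
# "don't say apple" rule's penalty decays with turns since it was given.
GLOBAL_PRIOR = {"apple pie": 0.9, "cherry pie": 0.6}  # apple is globally likelier
RULE_PENALTY = 1.0   # rule strength when freshly issued
HALF_LIFE = 5.0      # turns until the rule's influence halves

def score(candidate: str, turns_since_rule: int) -> float:
    decay = 0.5 ** (turns_since_rule / HALF_LIFE)
    penalty = RULE_PENALTY * decay if "apple" in candidate else 0.0
    return GLOBAL_PRIOR[candidate] - penalty

for t in (0, 5, 15):
    best = max(GLOBAL_PRIOR, key=lambda c: score(c, t))
    print(f"{t:>2} turns after the rule: model answers {best!r}")
```

Right after the rule is given, cherry pie wins; by 15 turns later the decayed penalty is too weak and "apple pie" comes back.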

[–] [email protected] 2 points 1 year ago

You don't get to complain about people being condescending to you when you're going around literally copy-pasting Wikipedia. Also, you're not right: major progress in this field started in the '80s; although the concepts were published earlier, they were basically ignored by researchers. You're making it sound like the NNs we're using now are the same as the ones from the '60s, when in reality our architectures, and even how we approach the problem, have changed significantly. It wasn't until the '90s-'00s that we started getting decent results that could even match older ML techniques like SVM or kNN.

[–] [email protected] 4 points 1 year ago* (last edited 1 year ago)

I'm surprised no one has said London Drugs https://www.londondrugs.com/about-london-drugs/about-us.html; that's the closest to Amazon imo. shoppersdrugmart.ca and fortinos.ca are owned by Loblaws and have an expanded marketplace selection.

Costco.ca is an American company but publicly traded. Well.ca used to be Canadian but is now owned by McKesson Corporation, which is American and publicly traded.

You could try to buy through Etsy, or look for Canadian Shopify stores like Bee Kind https://beekindwraps.ca/ (here is a complete list), which are more specific sites. You can often find the same products on marketplace sites, but the merchant pays lower fees if you go through their own site, so try searching for products there too.
