• 2 Posts
  • 565 Comments
Joined 1 year ago
Cake day: June 4th, 2023






  • I bake quite a bit and I don’t do my mise-en-place when it comes to baking either, but that’s not a problem. The way recipes are formatted works well for my process. I read through the steps ahead of time if it’s a recipe I’m unfamiliar with, then I’ll just have the ingredients list open while I’m doing the prep. The things I make are pretty basic (cookies, cakes, muffins, etc.) and the steps are all identical: mix wet, mix dry, mix everything, bake.

    I personally find that having less repeated information makes things easier and faster to read. If the recipe says “add flour”, you know it’s all the flour. If the recipe says “add flour (1 cup)”, then I have to check back against the ingredients list to figure out whether that’s all the flour or only part of it. And the more info you add to clarify, the harder the recipe is to skim while you’re cooking.





  • Valid opinion on the phrasing. Disagree with the premise that anything someone says is necessarily their opinion.

    Example: “For me, potatoes are easier to peel with a knife than a potato peeler” vs “Potatoes are easier to peel with a knife than a potato peeler”. The former says that this is my experience and yours may differ. The latter says that this is true in general and if you find it easier the other way, there’s a good chance you’re doing something wrong.








  • LLMs cannot:

    • Tell fact from fiction
    • Accurately recall data from their training sets
    • Count

    LLMs can:

    • Translate
    • Get the general vibe of a text (sentiment analysis)
    • Generate plausible text

    Semantics aside, they’re very different skills that require different setups to accomplish. Just because counting is an easier task than analysing text for humans doesn’t mean the same is true for an LLM. You can’t use that as evidence of its inability to do the “harder” tasks.