Can always trust AI to use the right words to say all the wrong things

Mar 7, 2025 4:20 PM

RoninTactican

Views 1708 | Likes 33 | Dislikes 4

Tags: tech, fail, ai

I did cut it twice and it is still too small

6 months ago | Likes 1 Dislikes 0

I'm no expert, but ...

6 months ago | Likes 2 Dislikes 0

Use a matchstick to make the hole tighter. If the matchstick is too big split it with a chisel.

6 months ago | Likes 2 Dislikes 0

Should have studied Geometric Dimensioning and Tolerancing (GD&T), "maximum material condition" (MMC) and "least material condition" (LMC)

6 months ago | Likes 1 Dislikes 0

By sanding they CLEARLY mean making sand stick to it, like it would stick to your feet as you come out of the water. SMH. Are you a millennial or some other work-averse generation? /s

6 months ago | Likes 3 Dislikes 0

Sand. The dowel.

Wow.

6 months ago | Likes 8 Dislikes 0

You gotta soak it in wood first to fortify it with extra layers, THEN sand the dowel down to the right size.

6 months ago | Likes 2 Dislikes 0

Only if you want to make it thicker.

6 months ago | Likes 5 Dislikes 0

It would work if you sand it in the opposite direction

6 months ago | Likes 2 Dislikes 0

Soak it in wood.

6 months ago | Likes 6 Dislikes 0

Try typing "dowel hole is too fucking big" instead. It will skip the AI "answer"

6 months ago | Likes 3 Dislikes 0

Yes that would work!

6 months ago | Likes 1 Dislikes 0

I'm feeling lucky

6 months ago | Likes 2 Dislikes 0

I kind of want to just to see but I also don't want to see.

6 months ago | Likes 1 Dislikes 0

Or get a bigger dowel?

6 months ago | Likes 3 Dislikes 0

SAND.IT!

6 months ago | Likes 3 Dislikes 0

It clearly says another alternative is using a SMALLER dowel.

6 months ago | Likes 1 Dislikes 0

For that you basically need to make your own. The issue I'm having is that I used a 3/4 inch bit, but the hole it drilled is slightly larger due to wobble of the drill bit (shitty bit), so now the dowels don't fit. They don't make dowels in mm increments (at least in the US), so I've got to either wedge the dowel or hope that some glue and sawdust will save the day.

6 months ago | Likes 2 Dislikes 0

Sand it or use a smaller dowel in some cases. It says it right there 🤪

6 months ago | Likes 2 Dislikes 0

It's easy to make your own if that works (do not rely on glue and sawdust). Having been in your shoes, I suggest a 7/8" dowel, worked (sanded) down to fit and glued in, then re-drilled if you really need a 3/4" dowel. (I'm a fan of step bits/unibits to avoid wobble on larger-diameter holes.) What is your project?

6 months ago | Likes 1 Dislikes 0

Good tips, thanks. I'm joining a bisected table top back together. It's just a rough job because it's going into my sister's garage workbench, and she doesn't really care or appreciate whether it's a super neat job, so I'm trying not to go overboard. I routed both sides of the table to make a rabbet joint, and the plan is to lock it in place with glue and 3/4" oak dowels. I did stagger the 3/4" holes, so even though they're a little loose the offset might still make it sturdy enough.

6 months ago | Likes 2 Dislikes 0

Nice. Since it's a work bench you might consider attaching a couple mending plates underneath for extra strength. They're cheap and come in any size you want.

6 months ago | Likes 1 Dislikes 0

Remember: AI is just a parrot that repeats things it’s read/heard. It’s slightly more intelligent than a search engine, but not much.

6 months ago | Likes 25 Dislikes 2

I have muted you like I have muted every other anti-AI ninny on this platform.

6 months ago | Likes 2 Dislikes 3

LLMs are literally spicy autocorrect. they work on the level of letters, not words. using them to generate shit is a huge waste of their potential :(

6 months ago | Likes 5 Dislikes 0

I have muted you like I have muted every other anti-AI ninny on this platform.

6 months ago | Likes 2 Dislikes 2

You couldn't be more wrong... Whole words are converted into tokens, which then get run through multiple neural network layers trained on vast amounts of data. Each token carries its own learned weights, and the combination of your tokens with the network's connections and weights calculates the most likely answer. That conversion of words to tokens is also why it can't answer "how many R's are in strawberry" very well: it never sees the letters.

6 months ago | Likes 1 Dislikes 1
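A rough sketch of the point above, assuming the Hugging Face `transformers` package and its public `gpt2` tokenizer (neither is named in the thread): the model only ever receives subword token IDs, not individual letters, which is why letter-counting questions are awkward for it.

```python
# Minimal sketch (assumes the `transformers` package and the public "gpt2" tokenizer):
# text is split into subword pieces and mapped to integer IDs; the model never
# sees the individual letters of "strawberry".
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("gpt2")
print(tok.tokenize("strawberry"))   # subword pieces, e.g. ['st', 'raw', 'berry']
print(tok.encode("strawberry"))     # the integer token IDs the model actually sees
```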

I have muted you like I have muted every other anti-AI ninny on this platform.

6 months ago | Likes 2 Dislikes 3

My friend, let's use this as a teaching moment: https://huggingface.co/blog/getting-started-with-embeddings. Go get learned. Tokenization happens later, and yes, things are treated as larger units by the attention layers.

6 months ago | Likes 2 Dislikes 0

I think we might be talking about different things; my comment was more about generative AIs. Embeddings are more like fine-tunes of already existing models, which your link also states: "The first step is selecting an existing pre-trained model for creating the embeddings." Also, it doesn't say it uses the letters of the word; it even says it uses "Once a piece of information (a SENTENCE, a document, an image)". Tokenization happens after standardization.

6 months ago | Likes 1 Dislikes 0
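For context on the embeddings side of this exchange, a minimal sketch assuming the `sentence-transformers` package and the public "all-MiniLM-L6-v2" model (neither is named in the thread): a whole sentence goes in and one fixed-length vector comes out, so the unit of work is the sentence, not its letters.

```python
# Minimal sketch, assuming the `sentence-transformers` package and the public
# "all-MiniLM-L6-v2" model: whole sentences map to fixed-length embedding vectors.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")
vec = model.encode("The dowel hole is too big.")
print(vec.shape)  # (384,) -- one embedding for the whole sentence
```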

Really? They do coding surprisingly well.

6 months ago | Likes 4 Dislikes 1

Yeah, you can add layers of stuff. Check out something called a Mixture of Experts, and things called Reasoning Models.

6 months ago | Likes 1 Dislikes 0

A lot of modern coding is copy paste
And there's plenty of good data to repeat from

6 months ago | Likes 7 Dislikes 0

Except when it correctly uses code I have written somewhere else in the project.

6 months ago | Likes 1 Dislikes 1