OnceBotheredTwiceShy
I did cut it twice and it is still too small
sumthinsumthinsumthin
I'm no expert, but ...
kitskinner19538
Use a matchstick to make the hole tighter. If the matchstick is too big split it with a chisel.
InablueMule
Should have studied Geometric Dimensioning and Tolerancing (GD&T), "maximum material condition" (MMC) and "least material condition" (LMC)
Sumdumguy42
By sanding they CLEARLY mean making sand stick to it, like it would stick to your feet as you come out of the water. SMH. Are you a millennial or some other work-averse generation? /s
IgnisInvictus
Sand. The dowel.
Wow.
TrumpGarglesHorseSemen
You gotta soak it in wood first to fortify it with extra layers, THEN you sand the dowel after to the right size.
witless1
Only if you want to make it thicker.
oddvilla
It would work if you sand it in the opposite direction
wigglemywammybar
Soak it in wood.
Lurker55684
Try typing "dowel hole is too fucking big" instead. It will skip the AI "answer"
InablueMule
Yes that would work!
Eyhlix
I'm feeling lucky
FrozenMojo
I kind of want to just to see but I also don't want to see.
trippingthelightfantastic
Or get a bigger dowel?
andthenthat
SAND.IT!
kisselFL
It clearly says another alternative is using a SMALLER dowel.
trippingthelightfantastic
RoninTactican
For that you basically need to make your own. The issue I'm having is I used a 3/4 inch bit, but the hole it drilled is slightly larger due to wobble of the drill bit (shitty bit), so now the dowels don't fit. They don't make dowels in mm increments (at least in the US), so I've got to either wedge the dowel or hope that some glue and sawdust will save the day.
wigglemywammybar
Sand it or use a smaller dowel in some cases. It says it right there 🤪
trippingthelightfantastic
It's easy to make your own if that works (do not rely on glue and sawdust). Having been in your shoes, I suggest a 7/8" dowel, worked (sanded) down to fit, glued in, then redrilled if you have to have a 3/4" dowel. (I'm a fan of step bits/unibits to avoid wobble on larger diameter holes.) What is your project?
RoninTactican
Good tips, thanks. I'm re-joining a bisected table top. It's just a rough job because it's going into my sister's garage workbench, and she doesn't really care or appreciate whether it's a super neat job, so I'm trying not to go overboard. I routed both sides of the table to make a rabbet joint, and the plan is to lock it in place with glue and 3/4" oak dowels. I did stagger the 3/4" holes, so even though they're a little loose the offset might still make it sturdy enough.
trippingthelightfantastic
Nice. Since it's a work bench you might consider attaching a couple mending plates underneath for extra strength. They're cheap and come in any size you want.
SF0X
Remember: AI is just a parrot that repeats things it’s read/heard. It’s slightly more intelligent than a search engine, but not much.
Antininny
I have muted you like I have muted every other anti-AI ninny on this platform.
Valkor
LLMs are literally spicy autocorrect. they work on the level of letters, not words. using them to generate shit is a huge waste of their potential :(
Antininny
I have muted you like I have muted every other anti-AI ninny on this platform.
Filolial
You couldn't be more wrong... Complete words are converted into tokens, which then get run through multiple neural networks that have been trained on vast amounts of data. Each token has its own learned weights, and the combination of your sequence of tokens with the network's connections and weights calculates the most likely answer. Converting words to tokens is also why it can't answer "How many R's in strawberry" very well: it never sees the individual letters.
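To make the strawberry point concrete, here's a toy sketch of greedy longest-match subword tokenization. The vocabulary is hypothetical (real tokenizers like BPE learn vocabularies with tens of thousands of pieces), but it shows why the model receives "strawberry" as two pieces, not ten letters:

```python
# Hypothetical subword vocabulary, just for illustration.
TOY_VOCAB = {"straw": 101, "berry": 102, "st": 103, "raw": 104, "b": 105}

def tokenize(text, vocab):
    """Greedy longest-match subword tokenization (toy version)."""
    tokens = []
    i = 0
    while i < len(text):
        for j in range(len(text), i, -1):  # try the longest piece first
            if text[i:j] in vocab:
                tokens.append(text[i:j])
                i = j
                break
        else:
            tokens.append(text[i])  # fall back to a single character
            i += 1
    return tokens

print(tokenize("strawberry", TOY_VOCAB))  # → ['straw', 'berry']
```

The model only ever sees the IDs 101 and 102, so "how many R's" has to be recalled from training data rather than read off the spelling.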
Antininny
I have muted you like I have muted every other anti-AI ninny on this platform.
Valkor
my friend, let's use this as a teaching moment. https://huggingface.co/blog/getting-started-with-embeddings go get learned. Tokenization happens later, and yes, things are treated as larger parts with attention by that layer.
Filolial
I think we might be talking about different things; my comment was more about generative AIs. Embeddings are more like fine-tunes of already existing models, which your link also states: "The first step is selecting an existing pre-trained model for creating the embeddings." Also, it doesn't say it uses the letters of the word; it even states it uses: "Once a piece of information (a SENTENCE, a document, an image)". Tokenization happens after standardization.
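For anyone lost in this subthread: an embedding maps a whole piece of text (a sentence, a document) to a vector, and similarity is measured between those vectors. This toy bag-of-words stand-in is NOT a real pre-trained model; it only shows that the unit of comparison is the sentence, not the letters:

```python
import math
from collections import Counter

def toy_embed(sentence):
    """Toy 'embedding': a bag-of-words count vector.

    A real sentence embedding comes from a pre-trained model, but the
    input unit is the same: a whole sentence, not individual letters.
    """
    return Counter(sentence.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)  # Counter returns 0 for missing words
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb)

print(cosine(toy_embed("the dowel is too small"),
             toy_embed("the dowel is too big")))  # ≈ 0.8
```

Four of the five words overlap, so the two sentences score high; a real embedding model would capture meaning rather than raw word overlap, but the pipeline is the same shape.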
IrrationalNumber
Really? They do coding surprisingly well.
Valkor
yeah you can add layers of stuff. check out something called a Mixture of Experts and things called Reasoning Models.
stseregh
A lot of modern coding is copy-paste
And there's plenty of good data to repeat from
IrrationalNumber
Except when it correctly uses code I have written somewhere else in the project.