The word originally entered English as an adaptation of the Portuguese albricoque or Spanish albaricoque.
However, it was subsequently changed to match the related French word abricot (where the 't' is silent).
It is also useful to compare this to the Italian albercocca or albicocca and the Old Spanish albarcoque.
These all stem from the Spanish-Arabic al-borcoque, which itself comes from the Arabic al-burqūq (literally "the" + "birqūq").
This Arabic term was adapted from Greek, appearing in the writings of Dioscorides around the year 100 AD.
The Greek word was probably adapted from the Latin præcoquum, a variant of præcox (plural præcocia), which translates to "early-ripe" or "ripe in summer."
In earlier Roman times, the fruit was actually called the "Armenian plum" or "Armenian apple."
By around the year 350, the writer Palladius was using both terms, referring to them as "Armenian or early-ripe" fruits.
The reason we use a "p" in English (apricot) instead of a "b" (abricot) is likely due to a mistake in etymology.
In 1617, the scholar Minsheu explained the name as if it meant in aprico coctus, or "cooked in a sunny place."
This "sunny" explanation stuck, even though it was technically incorrect!
In the psychology of creativity, there is a class of phenomena that distort the motivational setting for creative problem-solving, referred to as 'extrinsic rewards'. Management theory ran into this phenomenon with the first appearance of 'gamification' as a motivational toolkit, where 'scores' and 'badges' were awarded to participants in online activities. The psychological community responded by pointing out that earlier research had shown that while extrinsic rewards can indeed (at least initially) boost participation by introducing competitiveness, they ultimately prove to be poor substitutes for the far more sustainable and productive intrinsic motivators, such as curiosity, when it can be stimulated effectively (something which itself inevitably demands more creativity from the designer of the motivational resources). It seems that the motivational analogue in inference engines is an extrinsic reward process.
Says anyone who has seen them at work. They very obviously do not possess intelligence, given how often they fall over at basic tests that would never trip up a human. Take, for example, the "how many r's are in the word 'strawberry'" test. No person who is literate in English would fail this, but LLMs do (or did, until the companies making them put in a kludge because they were embarrassed by how it revealed the stupidity of their models).
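For reference, the counting task the comment describes is a one-liner in any programming language; the sketch below (function name mine, just for illustration) shows how trivially a deterministic program handles what the models stumbled on.

```python
def count_letter(word: str, letter: str) -> int:
    """Count case-insensitive occurrences of a single letter in a word."""
    return word.lower().count(letter.lower())

# "strawberry" contains three r's: st-r-awbe-rr-y
print(count_letter("strawberry", "r"))  # prints 3
```

The contrast is the point: the failure was never about the difficulty of the task, but about tokenization hiding individual letters from the model.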
Vernor Vinge has a story line where humans build their own portable chess computers and utilize them as assistants in human chess matches.
I still think this would be kinda cool. I could see a tournament providing the power source in addition to the chess clock. Then comes the gamesmanship, where you play moves you hope are computationally expensive for the opponent's AI but not for your own.
Honestly, AI is a trick to make us buy expensive new computers. I'm writing this from one that's over 10 years old, and the computers offered in a leaflet from the nearby electronics store aren't much better.
Anyone who remembers the 90s and 2000s, when your computer hardware was out of date within months, might disagree. If you want to do bleeding-edge things like running 70B+ LLMs locally or doing training, you need bleeding-edge hardware. It's no different than wanting to play the newest AAA games: there are plenty of games you can play on old hardware, and plenty of small LLMs. Given that you can just use ChatGPT or a bunch of other services, it isn't a trick that some people want to host their own models or do their own training; you simply need a system that can do that.
I mean, gaming is the big pusher of new hardware these days, and the web is basically the reason you can use a 90s computer in the modern day. I happily survived on roughly 10-year-old components all the way through university because I wasn't playing AAA games.
My parents bought a new laptop for their general household use and to watch YouTube on their TV via HDMI. It was so annoying and weird, and not even fast, that they returned it to Costco for the $800 within the 90-day window.
I set up a 10-year-old computer for them instead, running Linux Mint MATE, and it's perfect.
I think the nature of the thing being implanted would determine whether it is easier and safer to use a traditional incision. Hip replacements, for instance, use a titanium joint; yet, seeing as the bone being replaced was not made of titanium, nature has somehow found a way to make bone strong enough to do the job. There is an important sense in which nature 'builds things using 3D printing', and it does so without needing incisions. So even if we do manage to master 3D printing under the skin, as we practice it now, to the point where we can replace things that currently need materials unsuited to under-skin 3D printing...
Even as just a meme, no-hole surgery and the technology that makes it possible may perform a valid service to medical research simply by sparking the imagination of students. Reduced invasiveness is a time-honoured aim in surgical procedures, so taking it to the next (or ultimate) level does look worth encouraging. To some extent, a key question then becomes one of language: do we still call it surgery when nothing constituting an incision is involved? Yes, we make a (very small) hole when we inject the 'ink', but injections are never referred to as surgery. So if no-hole surgery is not a type of surgery, then what the heck kind of thing should we call it?
I've seen what appear to me to be far less deserving papers get the same kind of perhaps excessively exuberant hype, but in this case it's not looking quite that far off the mark, IMO.