Ars Technica
Google CEO says over 25% of new Google code is generated by AI
On Tuesday, Google's CEO revealed that AI systems now generate more than a quarter of new code for its products, with human programmers overseeing the computer-generated contributions. The statement, made during Google's Q3 2024 earnings call, shows how AI tools are already having a sizable impact on software development.
"We're also using AI internally to improve our coding processes, which is boosting productivity and efficiency," Pichai said during the call. "Today, more than a quarter of all new code at Google is generated by AI, then reviewed and accepted by engineers. This helps our engineers do more and move faster."
Google developers aren't the only programmers using AI to assist with coding tasks. It's difficult to get hard numbers, but according to Stack Overflow's 2024 Developer Survey, over 76 percent of all respondents "are using or are planning to use AI tools in their development process this year," with 62 percent actively using them. A 2023 GitHub survey found that 92 percent of US-based software developers are "already using AI coding tools both in and outside of work."
While ULA studies Vulcan booster anomaly, it’s also investigating fairing issues
A little more than a year ago, a snippet of video that wasn't supposed to go public made its way onto United Launch Alliance's live broadcast of an Atlas V rocket launch carrying three classified surveillance satellites for the US Space Force and the National Reconnaissance Office.
On these types of secretive national security missions, the government typically requests that the launch provider stop providing updates on the ascent into space when the rocket jettisons its two-piece payload fairing a few minutes after launch. And there should be no live video from the rocket released to the public showing the fairing separation sequence, which exposes the payloads to the space environment for the first time.
But the public saw video of the clamshell-like payload fairing falling away from the Atlas V rocket as it climbed downrange from Cape Canaveral, Florida, on September 10, 2023. It wasn't pretty. Numerous chunks of material, possibly insulation from the inner wall of the payload shroud's two shells, fell off the fairing. The video embedded below shows the moment of payload fairing jettison.
M2 and M3 MacBook Air models get bumped to 16GB of RAM for no extra money
Apple's week of Mac announcements isn't extending to an M4 MacBook Air—rumors indicate that the Air, as well as desktops like the Mac Studio and Mac Pro, will get new processors sometime in 2025. But Apple is bringing one of the best features of the new M4 Macs to the M3 MacBook Airs, as well as the entry-level M2 model: All of the base models are being bumped from 8GB to 16GB of RAM for the same prices as before. The M2 MacBook Air still starts at $999, while the 13- and 15-inch M3 versions start at $1,099 and $1,299.
All of these laptops were available with 16GB of RAM before, but it was normally a $200 upgrade. All of them still top out at 24GB of RAM, which is now a $200 upgrade to the 16GB models rather than a $400 upgrade as it was before. All models still start with 256GB of storage.
This week's launches mark the first time since 2012 that Apple has increased the amount of RAM in any of its base-model Macs, and the upgrade addresses one of our single biggest complaints about the laptops. Not all users will immediately notice the benefits of a 16GB RAM upgrade, but it will definitely make the laptops more versatile and capable of keeping up with users as their needs change.
Hyundai teases a three-row Ioniq 9 electric SUV
In November, Hyundai will formally unveil its next electric vehicle. It's the new Ioniq 9, a three-row SUV that uses Hyundai Motor Group's highly competent E-GMP platform seen in the Ioniq 5 and 6. Ahead of that reveal, the automaker shared some teaser images.
Regular readers will know that Hyundai's various design directions always have interesting names, and like the two smaller Ioniqs, the Ioniq 9 will feature "parametric pixels" in its headlamps—the blocky 8-bit look has been used to good effect on the Ioniq 5 and Ioniq 6.
E-GMP has already given rise to a big electric three-row SUV. Kia's EV9 has been on sale for a while now and has just about matched the cheaper EV6 in terms of sales for the last nine months, despite a $54,900 starting price that's more than $12,000 greater than the EV6's. (American car buyers really do want larger cars, and they vote with their wallets.)
Apple refreshes MacBook Pro lineup with M4 chips, introduces the M4 Max
Apple is following the M4 iMac and the redesigned Mac mini updates with one more major refresh this week: a new lineup of M4 MacBook Pros. These updates mostly follow the template set by last year's M3 MacBook Pro refresh: there's a 14-inch $1,599 base model with the standard M4, and then beefed up 14- and 16-inch versions with the M4 Pro and M4 Max processors that also offer more RAM, storage, an optional nano-texture display finish, and other amenities for power users.
All three versions of the M4 MacBook Pro are available for preorder today and begin arriving November 8, the same date as the new iMac and Mac mini refreshes.
New chips, same designs
Even without the M4's improvements, the new $1,599 MacBook Pro addresses the biggest gripe about the original: it upgrades the base model from 8GB to 16GB of RAM without increasing the price. If this was the only change Apple made, it would have been a good upgrade (and the company has taken exactly that approach to updating the M2 and M3 MacBook Airs, which also start with 16GB beginning today). Base storage still starts at 512GB.
AI, cloud boost Alphabet profits by 34 percent
Alphabet’s profit jumped 34 percent in the third quarter as the parent company of search giant Google reported strong growth in its cloud business amid robust demand for computing and data services used to train and run generative artificial intelligence models.
The solid results released on Tuesday helped alleviate investors’ fears about the financial returns on the vast sums being spent on AI by Alphabet and other Big Tech peers as they seek to dominate the nascent sector. The standout unit was Google Cloud, where revenue increased 35 percent to $11.4 billion and operating profit increased sevenfold to $1.9 billion from $266 million in the same period last year.
Net income was $26.3 billion compared with $19.7 billion in the same period a year earlier, exceeding analysts’ expectations for $22.8 billion. Revenue rose 15 percent to $88.3 billion in the three months through to the end of September, beating the average estimate for $86.3 billion.
The New Glenn rocket’s first stage is real, and it’s spectacular
Blue Origin took another significant step toward the launch of its large New Glenn rocket on Tuesday night by rolling the first stage of the vehicle to a launch site at Cape Canaveral, Florida.
Although the company's rocket factory in Florida is only a few miles from Launch Complex 36 at Cape Canaveral Space Force Station, because of the rocket and transporter's size, the procession had to follow a more circuitous route. In a post on LinkedIn, Blue Origin's chief executive, Dave Limp, said the route taken by the rocket to the pad is 23 miles long.
Limp also provided some details on GERT, the company's nickname for the "Giant Enormous Rocket Truck" devised to transport the massive New Glenn first stage.
Here’s the paper no one read before declaring the demise of modern cryptography
There’s little doubt that some of the most important pillars of modern cryptography will tumble spectacularly once quantum computing, now in its infancy, matures sufficiently. Some experts say that could be in the next couple decades. Others say it could take longer. No one knows.
The uncertainty leaves a giant vacuum that can be filled with alarmist pronouncements that the world is close to seeing the downfall of cryptography as we know it. The false pronouncements can take on a life of their own as they’re repeated by marketers looking to peddle post-quantum cryptography snake oil and journalists tricked into thinking the findings are real. And a new episode of exaggerated research has been playing out for the past few weeks.
All aboard the PQC hype train
The last time the PQC—short for post-quantum cryptography—hype train gained this much traction was in early 2023, when scientists presented findings that claimed, at long last, to put the quantum-enabled cracking of the widely used RSA encryption scheme within reach. The claims were repeated over and over, just as claims about research released in September have for the past three weeks.
These hornets break down alcohol so fast that they can’t get drunk
Many animals, including humans, have developed a taste for alcohol in some form, but excessive consumption often leads to adverse health effects. One exception is the Oriental hornet. According to a new paper published in the Proceedings of the National Academy of Sciences, these hornets can guzzle seemingly unlimited amounts of ethanol regularly and at very high concentrations with no ill effects—not even intoxication. They pretty much drank honeybees used in the same experiments under the table.
“To the best of our knowledge, Oriental hornets are the only animal in nature adapted to consuming alcohol as a metabolic fuel," said co-author Eran Levin of Tel Aviv University. "They show no signs of intoxication or illness, even after chronically consuming huge amounts of alcohol, and they eliminate it from their bodies very quickly."
Per Levin et al., there's a "drunken monkey" theory that predicts that certain animals well-adapted to low concentrations of ethanol in their diets nonetheless have adverse reactions at higher concentrations. Studies have shown that tree shrews, for example, can handle concentrations of up to 3.8 percent, but in laboratory conditions, when they consumed ethanol in concentrations of 10 percent or higher, they were prone to liver damage.
GitHub Copilot moves beyond OpenAI models to support Claude 3.5, Gemini
The large language model-based coding assistant GitHub Copilot will switch from exclusively using OpenAI's GPT models to a multi-model approach over the coming weeks, GitHub CEO Thomas Dohmke announced in a post on GitHub's blog.
First, Anthropic's Claude 3.5 Sonnet will roll out to Copilot Chat's web and VS Code interfaces over the next few weeks. Google's Gemini 1.5 Pro will come a bit later.
Additionally, GitHub will soon add support for a wider range of OpenAI models, including o1-preview and o1-mini, which are intended to be stronger at advanced reasoning than GPT-4, which Copilot has used until now. Developers will be able to switch between the models (even mid-conversation) to tailor the model to fit their needs—and organizations will be able to choose which models will be usable by team members.
The Ars redesign 9.0.2 brings the text options you’ve requested
Readers of those other sites may not care much about font size and column widths. "40-character line lengths? In 18-point Comic Sans? I love it!" they say. But not you, because you are an Ars reader. And Ars readers are discerning. They have feelings about concepts like "information density." And we want those feelings to be soft and cuddly ones.
That's why we're today rolling out version 9.0.2 of the Ars Technica site redesign, based on your continued feedback, with a special emphasis on text control. (You can read about the changes in 9.0.1 here.) That's right—we're talking about options! Font size selection, colored hyperlink text, even a wide column layout for subscribers who plonk down a mere $25/year (possible because we don't need to accommodate ads for subs).
“Impact printing” is a cement-free alternative to 3D-printed structures
Recently, construction company ICON announced that it is close to completing the world’s largest 3D-printed neighborhood in Georgetown, Texas. This isn’t the only 3D-printed housing project. Hundreds of 3D-printed homes are under construction in the US and Europe, and more such housing projects are in the pipeline.
There are many factors fueling the growth of 3D printing in the construction industry. It reduces the construction time; a home that could take months to build can be constructed within days or weeks with a 3D printer. Compared to traditional methods, 3D printing also reduces the amount of material that ends up as waste during construction. These advantages lead to reduced labor and material costs, making 3D printing an attractive choice for construction companies.
A team of researchers from the Swiss Federal Institute of Technology (ETH) Zurich, however, claims to have developed a robotic construction method that is even better than 3D printing. They call it impact printing, and instead of typical construction materials, it uses earth-based materials such as sand, silt, clay, and gravel to make homes. According to the researchers, impact printing is less carbon-intensive and much more sustainable and affordable than 3D printing.
TSA silent on CrowdStrike’s claim Delta skipped required security update
Delta and CrowdStrike have locked legal horns, threatening to drag out the aftermath of the worst IT outage in history for months or possibly years.
Each refuses to be blamed for Delta's substantial losses following a global IT outage caused by CrowdStrike suddenly pushing a flawed security update despite Delta and many other customers turning off auto-updates.
CrowdStrike has since given customers more control over updates and made other commitments to ensure an outage of that scale never happens again, but Delta isn't satisfied. The airline has accused CrowdStrike of willfully causing its losses, alleging that the company knowingly deceived customers by failing to disclose an unauthorized door into their operating systems that enabled the outage.
How The New York Times is using generative AI as a reporting tool
The rise of powerful generative AI models in the last few years has led to plenty of stories of corporations trying to use AI to replace human jobs. But a recent New York Times story highlights the other side of that coin, where AI models simply become a powerful tool aiding in work that still requires humanity's unique skillset.
The NYT piece in question isn't directly about AI at all. As the headline "Inside the Movement Behind Trump’s Election Lies" suggests, the article actually reports in detail on how the ostensibly non-partisan Election Integrity Network "has closely coordinated with the Trump-controlled Republican National Committee." The piece cites and shares recordings of group members complaining of "the left" rigging elections, talking of efforts to "put Democrats on the defensive," and urging listeners to help with Republican turnout operations.
To report the piece, the Times says it sifted through "over 400 hours of conversations" from weekly meetings by the Election Integrity Network over the last three years, as well as "additional documents and training materials." Going through a trove of information that large is a daunting prospect, even for the team of four bylined reporters credited on the piece. That's why the Times says in a note accompanying the piece that it "used artificial intelligence to help identify particularly salient moments" from the videos to report on.
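The Times hasn't published details of its pipeline, but a toy version of this kind of AI-assisted triage—scoring transcript segments against a reporter's query so that humans review only the most relevant moments—might look like the following sketch. All function names here are hypothetical, and a real system would likely use learned embeddings rather than this simple bag-of-words similarity:

```python
from collections import Counter
import math

def score(segment: str, query: str) -> float:
    """Cosine similarity between bag-of-words vectors of segment and query."""
    a, b = Counter(segment.lower().split()), Counter(query.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def salient(segments: list[str], query: str, top: int = 3) -> list[str]:
    """Rank transcript segments by similarity; humans still review every hit."""
    return sorted(segments, key=lambda s: score(s, query), reverse=True)[:top]
```

The point of such a tool is narrowing 400 hours of material to a reviewable shortlist; the reporting judgment—deciding what a flagged moment actually means—remains human work.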
Ban on Chinese tech so broad, US-made cars would be blocked, Polestar says
Today, Polestar electric vehicles gained access to the Tesla Supercharger network. That means US Polestar drivers have access to 17,800 more DC fast chargers than they did yesterday—once they get a NACS adapter, which can also be ordered today from their local Polestar service point. But right now, Polestar has bigger worries than expanding its charging options. Should proposed new rules banning Chinese connected-car software and hardware go into effect, they would effectively ban the automaker from the US market, the company says, including the EVs it builds in South Carolina.
The rule would ban Chinese connected-car software from US roads from model-year 2027 (midway through 2026) and Chinese connected car hardware from model-year 2030.
The ban on Chinese connected-car technology is the latest in a series of protectionist moves from the federal government and Congress. The revamped clean vehicle tax credit no longer applies to EVs made in China or with Chinese components in their battery packs, and the US Commerce Department has been pressuring Mexico to not offer generous incentives to Chinese automakers looking to set up shop nearby. Chinese-made EVs have also been subject to a 100 percent tariff since May.
Apple’s first Mac mini redesign in 14 years looks like a big aluminum Apple TV
Apple's week of Mac announcements continues today, and as expected, we're getting a substantial new update to the Mac mini. Apple's least-expensive Mac is being updated with new M4 processors, plus a smaller design that looks like a cross between an Apple TV box and a Mac Studio; it's the mini's first major design change since the original aluminum version was released in 2010, and its first update of any kind since the M2 models came out in early 2023. The mini is also Apple's first device to ship with the M4 Pro processor, a beefed-up version of the M4 with more CPU and GPU cores.
The cheapest Mac mini will still run you $599, which includes 16GB of RAM and 256GB of storage; as with yesterday's iMac update, this is the first time since 2012 that Apple has boosted the amount of RAM in an entry-level Mac. It's a welcome upgrade for every new Mac in the lineup that's getting it, but the $200 that Apple previously charged for the 16GB upgrade makes an even bigger difference to someone shopping for a $599 system than it does for someone who can afford a $999 or $1,299 computer.
The M4 Pro Mac mini starts at $1,399, a $100 increase from the M2 Pro version. Both models go up for preorder today and will begin arriving on November 8.
A candy engineer explains the science behind the Snickers bar
It’s Halloween. You’ve just finished trick-or-treating and it’s time to assess the haul. You likely have a favorite, whether it’s chocolate bars, peanut butter cups, those gummy clusters with Nerds on them, or something else.
For some people, including me, one piece stands out—the Snickers bar, especially if it’s full-size. The combination of nougat, caramel, and peanuts coated in milk chocolate makes Snickers a popular candy treat.
As a food engineer studying candy and ice cream at the University of Wisconsin-Madison, I now look at candy in a whole different way than I did as a kid. Back then, it was all about shoveling it in as fast as I could.
How can you write data to DNA without changing the base sequence?
Zettabytes—that’s 10²¹ bytes—of data are currently generated every year. All of those cat videos have to be stored somewhere, and DNA is a great storage medium; it has amazing data density and is stable over millennia.
To date, people have encoded information into DNA the same way nature has, by linking the four nucleotide bases comprising DNA—A, T, C, and G—into a particular genetic sequence. Making these sequences is time-consuming and expensive, though, and the longer your sequence, the higher chance there is that errors will creep in.
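The classical scheme described above maps bits directly onto the base sequence, with each base carrying two bits. A minimal illustrative sketch—the particular two-bit mapping here is arbitrary, not the scheme any specific lab uses:

```python
# Arbitrary mapping: each pair of bits becomes one nucleotide base.
BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Turn a byte string into a base sequence (four bases per byte)."""
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):          # most-significant bit pair first
            bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
    return "".join(bases)

def decode(seq: str) -> bytes:
    """Invert encode(): read bases back four at a time into bytes."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)
```

The catch the article points to is that every new payload requires synthesizing a brand-new sequence, base by base, which is where the time, cost, and error accumulation come in.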
But DNA has an added layer of information encoded on top of the nucleotide sequence, known as epigenetics. These are chemical modifications to the nucleotides, specifically altering a C when it comes before a G. In cells, these modifications function kind of like stage directions; they can tell the cell when to use a particular DNA sequence without altering the “text” of the sequence itself. A new paper in Nature describes using epigenetics to store information in DNA without needing to synthesize new DNA sequences every time.
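The epigenetic approach can be sketched the same way: the base sequence is a fixed, pre-made template, and the data lives in which CpG sites (a C immediately before a G) carry a methyl mark—one site per bit. A toy model of that idea, with all names hypothetical:

```python
def cpg_sites(template: str) -> list[int]:
    """Indices of every C that is immediately followed by a G."""
    return [i for i in range(len(template) - 1)
            if template[i] == "C" and template[i + 1] == "G"]

def write_bits(template: str, bits: list[int]) -> set[int]:
    """Return the set of CpG positions to methylate to store `bits`."""
    sites = cpg_sites(template)
    if len(bits) > len(sites):
        raise ValueError("template has too few CpG sites for this payload")
    return {site for site, bit in zip(sites, bits) if bit}

def read_bits(template: str, methylated: set[int], n: int) -> list[int]:
    """Recover the first n bits from the methylation pattern."""
    return [1 if site in methylated else 0
            for site in cpg_sites(template)[:n]]
```

Because the template never changes, writing new data means flipping methyl marks on an existing molecule rather than synthesizing a fresh sequence—which is the cost advantage the Nature paper is after.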
Ars Live: What else can GLP-1 drugs do? Join us today for a discussion.
News and talk of GLP-1 drugs are everywhere these days—from their smash success in treating Type 2 diabetes and obesity to their astronomical pricing, drug shortages, compounding disputes, and what sometimes seems like an ever-growing list of other conditions the drugs could potentially treat. There are new headlines every day.
Although the drugs have abruptly stolen the spotlight in recent years, researchers have been toiling away at developing and understanding them for decades, stretching back to the 1970s. Despite all the time and effort, the drugs still hold mysteries and unknowns. For instance, researchers thought for years that they worked directly in the gut to decrease blood sugar levels and make people feel full. After all, the drugs mimic an incretin hormone, glucagon-like peptide-1, that does exactly that. But, instead, studies have since found that they work in the brain.
In fact, the molecular receptors for GLP-1 are sprinkled in many places around the body. They're found in the central nervous system, the heart, blood vessels, liver, and kidney. Their presence in the brain even plays a role in inflammation. As such, research on GLP-1 continues to flourish as scientists work to understand the role it could play in treating a range of other chronic conditions.
For some reason, NASA is treating Orion’s heat shield problems as a secret
For those who follow NASA's human spaceflight program, the cause of the Orion spacecraft's heat shield cracking and chipping away during atmospheric reentry on the unpiloted Artemis I test flight in late 2022 became a burning question.
Multiple NASA officials said Monday they now know the answer, but they're not telling. Instead, agency officials want to wait until more reviews are done to determine what this means for Artemis II, the Orion spacecraft's first crew mission around the Moon, officially scheduled for launch in September 2025.
"We have gotten to a root cause," said Lakiesha Hawkins, assistant deputy associate administrator for NASA's Moon to Mars program office, in response to a question from Ars on Monday at the Wernher von Braun Space Exploration Symposium in Huntsville, Alabama.