Remedy is remaking the first two Max Payne games for PC, PS5 and Xbox Series X/S

Remedy is remaking Max Payne and Max Payne 2: The Fall of Max Payne. In a surprise announcement on Wednesday, the Finnish studio said it was working with Rockstar Games to remake the first two games in its cult classic third-person shooter series for PC, PlayStation 5 and Xbox Series X/S.

Under a new publishing agreement between the two companies, Rockstar will fund the project “in line with a typical Remedy AAA-game production.” What’s more, the studio will rebuild the games in its in-house Northlight Engine, the same engine Remedy used for its most recent title, Control. Nearly three years after its 2019 release, Control is still one of the best-looking games on both PC and consoles thanks to its implementation of ray-tracing.

“We were thrilled when our long-time friends at Remedy approached us about remaking the original Max Payne games,” said Rockstar Games co-founder Sam Houser. “We are massive fans of the work the Remedy team has created over the years and we can’t wait to play these new versions.”

Released in 2001, Max Payne was the game that put Remedy on the map and established the studio’s signature storytelling style. It was also one of the first games to include the bullet-time effect made popular by The Matrix. Two years later, Remedy released The Fall of Max Payne. Rockstar published both games before it went on to develop the third and currently final entry in the series on its own.

MLB’s latest streaming deal brings Sunday games to Peacock

Major League Baseball and NBCUniversal’s Peacock have reached a deal that will see 18 games broadcast on the streaming service during the 2022 season, per the latter’s tweet on Thursday.

The listed Sunday games will start between 11:30 am and noon ET, earlier than they would have in the past (sorry, West Coast), so as to minimize interference with the Sunday afternoon games that start at 1 pm ET. The MLB already has a partnership with ESPN for the broadcast rights to Sunday Night Baseball. The SiriusXM All-Star Futures Game will reportedly be a Peacock exclusive this season as well.

Though the details of the arrangement have not yet been formally announced, Forbes reported in March that this will likely be a two-year deal worth $30 million annually. The games will reportedly be available only on Peacock’s premium $10-a-month tier and will be exclusive, in that only local-market viewers will be able to watch without ponying up for a subscription, at least for the month the game they want to watch is airing. Additionally, MLB has struck a deal with Apple TV+ to broadcast “Friday Night Baseball” doubleheaders; those games start at 7 pm ET, just like ESPN’s Sunday Night Baseball matchups.

All of this broadcast hodgepodge is in addition to the MLB’s existing MLB.TV streaming service as well as a rumored “national service” that would purportedly eliminate local blackouts for streamers and attract fans from among cord-cutters. In all, the MLB’s national media deals will total $1.96 billion this season, a 26 percent increase from last year, per Forbes. So if you want to watch out-of-market baseball this year, you’d better have your password list and debit card ready.

Apple can now fix Face ID on the iPhone X without replacing the whole device

Apple debuted a program in March that let it repair Face ID on newer iPhones without replacing the whole device, but it left the iPhone X out of the equation — frustrating if you’re determined to use that ‘classic’ handset for a while longer. This shouldn’t be a problem from now on, though. MacRumors has learned that Apple and authorized repair centers can now repair Face ID on the iPhone X without requiring a full-on replacement. Your much-loved handset should otherwise remain intact.

The initial program only offered these more targeted repairs for the iPhone XS and newer models. The update expands support to all iPhones with Face ID.

The expansion comes as Apple rethinks its overall stance on repairs. The company made it easier for third-party repair shops to fix displays without breaking Face ID, and has announced a self-service repair program. While these moves may be in response to public and regulatory pressure, they’re welcome news for anyone hoping to extend the life of an Apple gadget without hefty fees or unnecessary device swaps.

Xbox controllers can now switch TV input back to your console

You won’t have to reach for a remote the next time you’re ready to return to your Xbox after a TV session. Microsoft is rolling out an update that lets you press the Xbox button on your controller to switch the input to your Series X or Series S. You can flip to cable TV during a download knowing that you just have to grab your controller when you’re ready to play.

The feature depends on the newer Xbox consoles’ support for HDMI-CEC. You can enable it in the “Sleep mode & startup” settings section under the “TV & A/V power options” selection.

The concept isn’t completely new, whether in consoles or for HDMI-CEC devices in general. Even so, it’s difficult to complain when this could save you time wading through TV menus just to get back to Halo Infinite.

Police reports suggest a larger pattern of AirTag stalking

It’s been clear for a while that bad actors are planting location trackers on people without their knowledge in order to follow their movements. Trackers have been used in car thefts as well. Now, Motherboard has obtained police data that sheds some light on the extent of the issue.

The publication requested records mentioning Apple AirTags (which the company announced a year ago) from dozens of police departments across the US. The requests covered an eight-month period.

Motherboard received 150 reports from eight police departments and found that, in 50 cases, women called the cops because they received notifications suggesting that someone was tracking them with an AirTag or they heard the device chiming. Half of those women suspected the tags were planted in their car by a man they knew, such as a current or former romantic partner or their boss.

The vast majority of the reports were filed by women. There was just one case in which a man made a report after suspecting that an ex was using an AirTag (which costs just $29) to stalk him. Around half of the reports mentioned AirTags in the context of thefts or robberies.

Just one instance of AirTag-related stalking would be bad enough. Fifty reports in eight jurisdictions in eight months is a not insignificant number and there are likely other cases elsewhere that haven’t been disclosed. Engadget has contacted Apple for comment.

Although iPhones already automatically detect unwanted nearby AirTags, Apple said in February that it would do more to mitigate the issue. Later this year, it will roll out a precision finding feature for iPhone 11, 12 and 13 owners to help them more easily locate unknown AirTags. It will also inform iPhone users more clearly when someone may be using an AirTag to follow them.

In December, Apple released an Android app to help people using phones powered by that OS detect errant AirTags. Tile updated its Android and iOS apps with a similar feature. But those require users both to be aware of the threat posed by unwanted trackers and to scan for them manually. Last week, however, it emerged that Google is exploring OS-level tracker detection for Android, which could help keep people safe should it roll out the feature.

THQ Nordic will host a digital game showcase on August 12th

THQ Nordic will host its second annual digital showcase on August 12th, the publisher announced on Wednesday. The company said it would announce new games as well as share updates on previously announced ones during the event, with the entire proceedings available to watch via Twitch, Steam and YouTube starting at 3PM ET. 

Based on the trailer the company shared, fans can look forward to updates on SpongeBob SquarePants: The Cosmic Shake and Destroy All Humans! 2 – Reprobed, among other titles. With the Entertainment Software Association not moving forward with an E3 this year, it’s likely more publishers will soon announce similar events timed for the summer months. By announcing its showcase this far in advance and scheduling it months after E3’s traditional June window, the company likely wants to carve out a space where it doesn’t have to compete with bigger publishers like Microsoft and Sony for attention.

Meta won’t host its F8 developer conference this year

Meta still isn’t keen on reviving its F8 conference, but this time it’s not due to the pandemic. The Facebook parent firm revealed that it “will not hold” F8 in 2022 as it retools for “building the metaverse.” Instead, Meta will lean on Conversations (its first business messaging event), Connect and other developer presentations throughout the rest of the year.

The company hasn’t held a live F8 since 2019. Both F8 2020 and 2021 were online-only due to COVID-19. This isn’t the first time Meta skipped F8 altogether, though. It didn’t hold conferences in 2009, 2012 and 2013.

A move like this isn’t shocking. Facebook only rebranded itself as Meta in October, and the company is still in the early days of defining its metaverse vision and creating relevant tools. The F8 2022 no-show could give Meta more time to present a clearer strategy, not to mention create more augmented reality and virtual reality technology it can pitch to creators.

Twitter appears to have quietly altered a key way deleted tweets can be preserved

Twitter might finally be delivering an edit button, but the company appears to have quietly altered a key way deleted tweets can be preserved. As writer Kevin Marks first pointed out, the company changed the JavaScript behind embedded tweets so that the text of deleted tweets is no longer visible in embeds on outside websites.

Previously, the text of a deleted tweet was still visible on web pages on which it had been embedded, but now Twitter is using JavaScript to render the tweet as a blank white box. It might not seem like a major change on Twitter’s part, but it’s one that has significant implications. Tweets from public officials, celebrities and the general public are frequently embedded into news stories. Even if those tweets were later deleted, there was a clear record of what had been said.

Now, there are untold numbers of old articles where instead of a tweet there’s just a blank box without context. For example, tweets from former President Donald Trump were routinely cited by media organizations. Even after his account was permanently suspended, the text of those missives was still viewable on the sites where it had been embedded. Now, that’s no longer the case.

In Trump’s case, there are extensive archives of those tweets. But that’s not the case for the majority of Twitter users, or even many public officials. And while it’s still technically possible to view the text by disabling javascript in your browser, it’s not the kind of step most people would know how to do even if they knew the option existed.
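As a rough illustration of why that workaround works: the standard embed markup ships the tweet’s wording as fallback text inside a blockquote that widgets.js would normally overwrite at render time. The sketch below is a hypothetical script (the URL is a placeholder, and it assumes the requests and beautifulsoup4 packages are installed), not anything Twitter or publishers provide; it simply reads that static fallback out of a page.

# Hypothetical sketch: recover embedded-tweet text from a page's static HTML.
# The standard embed is a <blockquote class="twitter-tweet"> whose fallback
# paragraph widgets.js replaces at render time; with no JavaScript running,
# the original wording is still sitting in the markup.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com/article-with-embedded-tweet").text  # placeholder URL
soup = BeautifulSoup(html, "html.parser")

for quote in soup.select("blockquote.twitter-tweet"):
    paragraph = quote.find("p")  # the fallback <p> carries the tweet text
    if paragraph:
        print(paragraph.get_text())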

Twitter product manager Eleanor Harding told Marks the change was made “to better respect when people have chosen to delete their Tweets.” A spokesperson for Twitter declined to comment further on the change. 

Still, it’s a curious move because, as Marks points out in his post, preserving the text of deleted tweets was an intentional decision on the part of Twitter’s engineers. “If it’s deleted, or 1000 years in the future, the text remains,” former Twitter engineer Ben Ward wrote in 2011 when embedding tweets was first announced.

That’s in line with statements from other Twitter executives over the years about the importance of Twitter as a kind of “public record.” For example, former CEO Jack Dorsey said in 2018 he was hesitant to build an edit button because it could erode Twitter’s ability to function as a public record. “It’s really critical that we preserve that,” he said at the time.

OpenAI’s DALL-E 2 produces fantastical images of most anything you can imagine

In January 2021, the OpenAI consortium — co-founded by Elon Musk and financially backed by Microsoft — unveiled its most ambitious project to date, the DALL-E machine learning system. This ingenious multimodal AI was capable of generating images (albeit rather cartoonish ones) based on the attributes described by a user — think “a cat made of sushi” or “an X-ray of a capybara sitting in a forest.” On Wednesday, the consortium unveiled DALL-E’s next iteration, which boasts higher resolution and lower latency than the original.

A bowl of soup that looks like a monster, knitted out of wool.
OpenAI

The first DALL-E (a portmanteau of “Dali,” as in the artist, and “WALL-E,” as in the animated Disney character) could generate images as well as combine multiple images into a collage, provide varying angles of perspective, and even infer elements of an image — such as shadowing effects — from the written description. 

“Unlike a 3D rendering engine, whose inputs must be specified unambiguously and in complete detail, DALL·E is often able to ‘fill in the blanks’ when the caption implies that the image must contain a certain detail that is not explicitly stated,” the OpenAI team wrote in 2021.

Macro 35mm film photography of a large family of mice wearing hats cozy by the fireplace.
OpenAI

DALL-E was never intended to be a commercial product, so its abilities were somewhat limited by the OpenAI team’s focus on it as a research tool. It has also been intentionally capped to avoid a Tay-esque situation or the system being leveraged to generate misinformation. Its sequel has been similarly sheltered: potentially objectionable images were preemptively removed from its training data, and a watermark indicating that a picture is AI-generated is applied automatically. Additionally, the system actively prevents users from creating pictures based on specific names. Sorry, folks wondering what “Christopher Walken eating a churro in the Sistine Chapel” would look like.

DALL-E 2, which utilizes OpenAI’s CLIP image recognition system, builds on those image generation capabilities. Users can now select and edit specific areas of existing images, add or remove elements along with their shadows, mash up two images into a single collage, and generate variations of an existing image. What’s more, the output images are 1024px squares, up from the 256px images the original version generated. CLIP was designed to look at a given image and summarize its contents in a way humans can understand; for the new system, the consortium reversed that process, building an image from its summary.
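For a sense of what CLIP does in its original, “forward” direction, here is a minimal sketch using the open-source CLIP package OpenAI released alongside the model. The image file and captions are placeholders, and this shows only the image-to-text matching half, not the reversed, image-generating process DALL-E 2 builds on top of it.

# Minimal sketch of CLIP's image-to-text matching (placeholder file and captions).
# Requires torch, Pillow and the open-source package:
#   pip install git+https://github.com/openai/CLIP.git
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("astronaut.png")).unsqueeze(0).to(device)  # placeholder image
captions = [
    "a photo of an astronaut riding a horse",
    "a bowl of soup that looks like a monster",
]
text = clip.tokenize(captions).to(device)

with torch.no_grad():
    # Score every caption against the image; softmax gives a rough
    # "which summary best fits this picture" distribution.
    logits_per_image, _ = model(image, text)
    probs = logits_per_image.softmax(dim=-1)

print(dict(zip(captions, probs[0].tolist())))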

Teddy bears mixing sparkling chemicals as mad scientists.
OpenAI

“DALL-E 1 just took our GPT-3 approach from language and applied it to produce an image: we compressed images into a series of words and we just learned to predict what comes next,” OpenAI research scientist Prafulla Dhariwal told The Verge.

Unlike the first version, which anybody could play with on the OpenAI website, the new one is currently only available for testing by vetted partners, who are themselves limited in what they can upload or generate with it. Only family-friendly sources can be utilized, and anything involving nudity, obscenity, extremist ideology or “major conspiracies or events related to major ongoing geopolitical events” is right out. Again, sorry to the folks hoping to generate “Donald Trump riding a naked, COVID-stricken Nancy Pelosi like a horse through the US Senate on January 6th while doing a Nazi salute.”

A photo of an astronaut riding a horse.
OpenAI

The current crop of testers is also barred from exporting generated works to third-party platforms, though OpenAI is considering adding DALL-E 2’s abilities to its API in the future. If you want to try DALL-E 2 for yourself, you can sign up for the waitlist on OpenAI’s website.