LightBlog

Thursday, August 20, 2020

Google’s first feature drop for the Pixel Buds adds attention alerts, bass boost, and transcribe mode

The second-generation Pixel Buds from Google are truly wireless earbuds that pack a ton of features at a price of $179. Just like Google's Pixel smartphones, the Pixel Buds are getting “feature drops” that add a bevy of new features and improvements. The first of these “feature drops” has arrived for the second-gen Pixel Buds, as announced in a Google blog post, and it brings some substantial improvements.

The first of these improvements is sharing detection, which detects if you’re sharing one earbud with someone and then lets you tweak the volume independently on each earbud. This way, you don’t have to sacrifice your volume preferences when you’re listening to music or watching a movie with someone else. And the new bass boost toggle can add a bit of extra oomph to music.

Another new feature that is being added with this update is a “transcribe” mode that can help you understand another language better by having translated speech read directly into your ear. This feature is launching initially for French, German, Italian, and Spanish speakers to translate English speech. For example, a Spanish speaker can ask Google Assistant on their Pixel Buds “Hey Google, ayúdame a entender inglés” to start translating and transcribing English to Spanish in the Google Translate app. Translation isn’t instantaneous and requires audio to be transmitted to Google for processing.

There’s also an experimental “Attention Alerts” feature that can detect when something important happens near you that may need your immediate attention. The feature detects things like a dog barking, a baby crying, or an emergency vehicle driving by and temporarily lowers the volume of whatever you’re listening to so you can focus on your surroundings.

As previously announced, you can now find your Google Pixel Buds in the Find My Device app. This works for other Fast Pair-enabled devices, too. You can also now ask Google Assistant to toggle touch controls or check the battery life on your earbuds. Finally, people who have been facing connection issues with their buds—an issue that has plagued users since the Pixel Buds were first released a few months back—will be glad to know that this update also addresses some of those issues, according to Android Police. So be sure to update to firmware version 550 if you’ve been facing these issues.

Google Pixel Buds (Free, Google Play)

The Google Pixel Buds are available starting today in Oh So Orange, Quite Mint, and Almost Black in the U.S. for $179 on the Google Store.

The post Google’s first feature drop for the Pixel Buds adds attention alerts, bass boost, and transcribe mode appeared first on xda-developers.



from xda-developers https://ift.tt/3hgX7XZ
via IFTTT

Google Autofill on Android now supports biometric authentication

Back in January, we discovered that Google was testing biometric authentication support for its first-party Autofill service. Now, this feature appears to be rolling out to Google Autofill via a recent update to Google Play Services.

The new feature means that users can require successful biometric authentication before information can be filled into a form. Previously, Google’s Autofill service had no additional safeguards beyond the user’s lock screen, making it less secure than most third-party password managers.

Google Autofill security

According to Android Police, the feature appears to have become widely available for users on Google Play Services v20.33.13, but it’s also available for some on version 20.26.14. It appears this is rolling out via a server-side update, so it’s just a matter of waiting for a flag to be flipped on your device.

The new feature can be found by heading to Settings > Google > Autofill > Autofill with Google. You’ll see a new menu called Autofill Security, where you can toggle biometrics. Once toggled, users can use any biometric authentication hardware that’s supported by the BiometricPrompt API, which includes fingerprint scanners, iris scanners, or secure facial recognition hardware like on the Pixel 4 and Pixel 4 XL. Of course, you’ll also need to turn on the Autofill with Google service in Settings > System > Languages & input > Autofill service.

Adding biometric authentication not only makes logging in more secure, but it also makes it more convenient and accessible to users.




from xda-developers https://ift.tt/2Q9En0y
via IFTTT

[Update: Rolling out] Fossil Gen 5 smartwatches will soon get sleep tracking and VO2 monitoring

Update 1 (08/20/2020 @ 2:00 PM ET): The big update for Fossil’s Gen 5 smartwatches is now rolling out. Scroll to the bottom for more information. The article as published on August 10, 2020, is preserved below.

Fossil Gen 5 smartwatch owners are in for a treat as their year-old smartwatches will soon be getting a host of new features via a software update. The update, which is scheduled to roll out on August 19th, includes a new sleep tracking feature, VO2 max monitoring, and more. According to a recent report from Droid-Life, Fossil describes the update as a “Wellness app rollout” that will allow users to start a run, measure cardio fitness levels, and track their sleep using their Gen 5 smartwatch.

The Wellness app could be an entirely new app for Fossil smartwatches that users will have to download on their smartphone or access through the smartwatch. If that’s the case, users will be able to easily access all their fitness tracking information in one convenient location.

Fossil Gen 5 smartwatches

Furthermore, a report from Engadget points out that the update will include new avatars for contacts, easier access to “key tools,” and new battery modes. The battery modes are especially interesting, as they’ll let users extend the battery life of their smartwatches to at least 24 hours on a single charge.

Along with the features mentioned above, the new Wellness app will also let users track VO2 max readings, which will prove to be quite useful for those of you who are serious about working out. While Fossil’s Gen 5 smartwatches won’t be the first wearables to feature VO2 max monitoring, it’s great to see Fossil bringing the feature to the year-old lineup. As of now, Fossil has released no further information about the update. We’ll update this post as soon as we learn more from the company.

Update: Rolling Out

As reported by Droid-Life, the big update bringing new health and wellness features is rolling out to fifth-generation smartwatches from Fossil. The update includes an optimized activity tracker, sleep tracking, cardio fitness tracking, a new UI and features for battery modes, and phone app updates. Fossil boasts that the update allows for essentially 24/7 sleep tracking thanks to the fact that its smartwatches can be charged to 80% in less than an hour. The new Wellness app can monitor real-time metrics such as heart rate, pace, distance, steps, calories, and more using half as much battery as before by shifting processing off the main application processor.




from xda-developers https://ift.tt/2XOrYDw
via IFTTT

Microsoft’s Eye Contact feature goes live on the Surface Pro X to keep your gaze focused in video calls

Update 1 (08/20/2020 @ 1:45 PM ET): This feature is now generally available for users of the Microsoft Surface Pro X. Scroll to the bottom for more information. The article as published on July 24, 2020, is preserved below.

At the Surface event late last year, Microsoft unveiled the new Surface Pro X — an ARM version of the Surface Pro 7 powered by a custom Qualcomm Snapdragon 8cx processor (AKA Microsoft SQ1). During the Surface Pro X presentation, the company also showcased a new technology exclusive to the device, which uses AI to automatically focus a user’s gaze at the camera during video calls. However, the feature wasn’t available when the Surface Pro X first went on sale in November last year. Now, almost a year after the original announcement, Microsoft has finally started rolling out the Eye Contact feature in the latest Windows 10 Insider Preview build 20175.

The all-new feature relies on the artificial intelligence capabilities offered by Microsoft’s custom SQ1 processor and automatically adjusts the user’s gaze during video calls. According to a recent blog post from the company, Surface Pro X users who are enrolled in the Windows Insider program can now enable the Eye Contact feature from within the Surface app on their devices.

Microsoft Surface Pro X Eye Contact feature

Since the new Eye Contact feature makes use of Microsoft’s SQ1 processor, it will only be available on the Surface Pro X and won’t be released for other devices running the latest Windows 10 Insider Preview build. Along with the new Eye Contact feature, the latest Windows 10 Insider Preview build brings improvements to pinned sites in Microsoft Edge, new icons for stock apps like Sticky Notes and Snip & Sketch, and several dev-focused changes.

It’s worth noting that Microsoft isn’t the only company to offer such a feature. Apple has also been working on a similar technology called FaceTime Attention Correction, which was briefly available in beta releases of iOS 13 last year. While the technology was pulled from the final iOS 13 release, it should make it to users with iOS 14 later this year.

Update 1: Eye Contact generally available

Microsoft announced today that its AI-powered Eye Contact feature, which automatically adjusts your gaze on video calls and recordings to make it look like you’re staring directly at the camera, is now generally available for all owners of the Surface Pro X. When this feature first rolled out last month, it was limited to Surface Pro X users in the Windows 10 Insider Preview channel. Eye Contact works with video calling apps like Microsoft Teams, Skype, and others.

According to Steve Bathiche, Microsoft Technical Fellow, Eye Contact uses the dedicated AI silicon in the Microsoft SQ1 processor, Microsoft’s customized version of the Qualcomm Snapdragon 8cx. Since the processing required for this feature is offloaded to the dedicated AI chip, Microsoft says Eye Contact won’t impact battery life on the Surface Pro X. Microsoft says the Surface Pro X is the first Windows 10 PC to “fully offload AI onto a specialized chip,” and while it’s no longer the only device on the market with the Snapdragon 8cx, the feature depends on the SQ1’s AI engine, so it won’t be available on other existing devices.

The feature can be toggled on or off inside the Surface app, and once enabled, it’s automatically applied any time an app uses the camera. Because the feature is baked into the camera pipeline, apps don’t need to add support for it. Microsoft leaves the feature disabled by default, since the company believes image-modifying features should be opt-in.




from xda-developers https://ift.tt/32Tw3tw
via IFTTT

Google beefs up its SOS alerts in Search and Maps with near real-time wildfire data

When you search for information about a wildfire in Google Search, you’ll now get results that show you key insights about the areas impacted by the disaster. Google today detailed its efforts to use satellite data to beef up its SOS alerts in Google Search and Google Maps. The search engine now shows wildfire boundary maps that allow the public to see exactly where a blaze is and how best to avoid it. Search results will also provide news articles and helpful resources from local emergency agencies.

Google is using satellite data from the National Oceanic and Atmospheric Administration’s (NOAA) GOES constellation of satellites and Google Earth Engine’s data analysis capabilities. This will allow Google to show the size of a wildfire in near real-time, with data refreshed hourly.

As Google explains:

NOAA’s satellites include infrared and optical sensors optimized for detecting “hot spots” or large wildfires on the Earth’s surface. We run computations on this data in Earth Engine to identify the affected area.

From there, we create a digital polygon—the red boundary shown on the map in Search and Google Maps—that represents the approximate wildfire impact area. If multiple large fires are active in the same region, people may see several polygons.
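Conceptually, the polygon step Google describes boils down to thresholding an infrared brightness grid to find hot cells, then wrapping those cells in a boundary. Here is a toy Python sketch of that idea — not Google’s actual Earth Engine pipeline; the grid, threshold, and function names are invented for illustration, and a convex hull stands in for whatever boundary algorithm Google really uses:

```python
# Toy illustration of the hot-spot -> boundary-polygon idea described above.
# NOT Google's Earth Engine pipeline; data and threshold are invented.

def detect_hot_cells(grid, threshold):
    """Return (row, col) coordinates of cells whose value exceeds threshold."""
    return [(r, c)
            for r, row in enumerate(grid)
            for c, val in enumerate(row)
            if val > threshold]

def convex_hull(points):
    """Andrew's monotone-chain convex hull; returns hull vertices in order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # Cross product of vectors o->a and o->b (positive = left turn).
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Concatenate halves, dropping each endpoint once to avoid duplicates.
    return lower[:-1] + upper[:-1]

# A fake 5x5 "infrared brightness" grid with one hot region.
grid = [
    [0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0],
    [0, 9, 9, 9, 0],
    [0, 0, 9, 9, 0],
    [0, 0, 0, 0, 0],
]
hot = detect_hot_cells(grid, threshold=5)
polygon = convex_hull(hot)  # vertices of the approximate impact boundary
```

In a real pipeline the grid cells would be georeferenced satellite pixels and the resulting vertices would become the red boundary overlay shown in Search and Maps; multiple disjoint hot regions would each yield their own polygon.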

When a person searches for something broadly like “wildfire in California” or something specific like “Kincade fire”, they will see the approximate boundary, name, location, relevant news articles, and helpful resources from local emergency agencies that pertain to a nearby wildfire. In Maps, users will receive a warning if they approach an active blaze, with an ambient alert that points them to the latest information.

Google said it piloted the new feature in California to determine its usefulness and plans to roll it out to other parts of the world.

With heatwaves sweeping across parts of the U.S., the threat of fire to human life is high. Firefighters have already battled large blazes in Northern and Southern California, as well as in Colorado. With the help of satellite data, Google is hoping to provide the public with high-quality information, potentially saving lives across these regions.

In addition to providing near real-time wildfire data in Google Maps, Google recently detailed an initiative designed to use Android devices to detect earthquakes.




from xda-developers https://ift.tt/3aLU7AG
via IFTTT

Kernel sources for the OnePlus Nord and Realme X3 SuperZoom are now available

The OnePlus Nord has already received a taste of aftermarket development via an unofficial TWRP build and a pair of custom ROMs, but all of these builds were utilizing the pre-built kernel from the stock OxygenOS firmware. The unavailability of the official kernel source code for the Nord at launch is a bit strange, considering the fact that OnePlus has historically published day-1 kernel sources for many of their phones. The wait is now over, though, as OnePlus has finally refreshed its GitHub repo and uploaded the kernel source code for the OnePlus Nord.

OnePlus Nord XDA Forums


The initial kernel source code release for the OnePlus Nord (codename “avicii”) is based on OxygenOS 10.5.2, not the latest OxygenOS 10.5.4. Nevertheless, the published sources should help get the ball rolling on refining the stability of AOSP-based custom ROMs and aid the creation of custom kernels. You can take a look at the code by following the link below. Thanks to XDA Senior Member Some_Random_Username for the tip!

OnePlus Nord Kernel Sources

Besides OnePlus, Realme is another brand under BBK Electronics that maintains an admirable track record when it comes to publishing the kernel source code for the smartphones they sell. The company has now released the kernel source for the Realme X3 SuperZoom – an aggressively priced sub-flagship phone with some solid specifications like the Qualcomm Snapdragon 855+ SoC, a 120Hz display, and 30W Dart 3.0 fast charging.

Realme X3 SuperZoom: Kernel Sources | XDA Forums

Unlocking the bootloader of the Realme X3 SuperZoom is not a difficult task, and now that the kernel source for the phone is here, it should help kickstart third-party development for the device. Note that the base Realme X3 and the SuperZoom variant are nearly identical apart from their camera configurations, so it’s possible that the aforementioned kernel source is unified across both models.





from xda-developers https://ift.tt/3kZySjo
via IFTTT

Wireless Android Auto works on all Android 11 devices with 5GHz Wi-Fi

Google first rolled out wireless Android Auto all the way back in 2018 but, at the time, the feature was limited to Nexus and Pixel devices running Android 8.0 Oreo. Soon after its launch, a few users managed to enable wireless Android Auto on some non-Google devices, and the company also extended support to some Samsung Galaxy flagships. While Google has since rolled out wireless Android Auto support in more regions, the number of officially supported devices has remained small. However, that’s expected to change with the Android 11 rollout later this year.

According to a recent report from 9to5Google, Google has updated its support page for Android Auto with a new note which states that “Any smartphone with Android 11.0” can use Android Auto wirelessly. This means that all Android devices that are expected to receive the Android 11 update later this year will be able to connect wirelessly to Android Auto. But there’s one catch.

Android Auto wireless Android 11

To connect to Android Auto wirelessly, devices will need to support 5GHz Wi-Fi networks. Google further notes that some EU residents may not be able to use wireless Android Auto even after receiving the Android 11 update, as the EU has specific requirements for 5GHz Wi-Fi use in cars. Similarly, users in countries like Japan and Russia won’t be able to use Android Auto wirelessly.

The update comes at a time when Google is actively extending Android Auto support to more car manufacturers and app developers. The company recently revealed that Android Auto is on track to be in more than 100 million cars in the coming months and the platform is set to receive a host of new features that will help users with navigation, parking, and electric vehicle charging.


Source: Android Auto Help

Via: 9to5Google




from xda-developers https://ift.tt/3l1YFay
via IFTTT