LightBlog

mardi 13 décembre 2016

Makers of the CoWatch Have Been Acquired by Google

Cronologics was founded in 2014 by Lan Roche, Leor Stern, and John Lagerling, all three of whom had previously worked in business development at Google before leaving the company. For the past two years their new venture has been working on a way to build "compelling wearable hardware", and in April of this year the team launched an Indiegogo campaign for its first product.

The wearable was dubbed the CoWatch, and its main selling point was its integration with Amazon Alexa. Marketed as the "world's first Amazon Alexa-enabled smart watch", the campaign was funded on June 3rd, raising over $293,000 and overshooting its goal by 281%. The watch saw an official product launch too, going on sale on Amazon in September for $280.

This week, it's been confirmed that the Cronologics team has been acquired by Google. While the details of the acquisition have yet to be revealed, we're told that the folks at Cronologics will now be working at Google's Android Wear division. Google has had to delay the launch of Android Wear 2.0 so this injection of new talent might be enough to get the project back on track. We'll just have to wait and see if there ends up being any CoWatch DNA found in Android Wear 2.0 and beyond.

The team at Cronologics had said it wanted to bring its CoWatch technology to other platforms (like iOS), but it now seems clear that the CoWatch era is over and those plans are likely scrapped. The team had the backing of some major investors, including CoinDesk, Student.com, and Shakil Khan, the head of Spotify's special projects. Given that pedigree, bringing this team back under the Google umbrella could do wonders for Android Wear's future.

Source: Cronologics



from xda-developers http://ift.tt/2hpeKtb
via IFTTT

CyanogenMod 14.1 Joins OnePlus 3T’s ROM Roster as Development Picks Up

The development scene on the OnePlus 3T is picking up, even though the device faces partial competition from its own predecessor. Thanks to the collaborative nature of open source, and OnePlus's decision to release the kernel sources and device tree of the OnePlus 3T, the device is on track to compete with its predecessor in the third-party scene as well. The most surprising aspect is that all of this work has appeared in a matter of days!

In addition to our previous story about the OnePlus 3T getting Sultanxda's custom CyanogenMod 13 builds, the device is seeing substantial developer interest for a phone that is barely a month old and only just getting into the hands of consumers. Just a few hours ago, the device reached another community development milestone when it received its first public release of (unofficial) CyanogenMod 14.1 based on Android 7.1.1, courtesy of XDA Senior Member PeterCxy. For a first build, the ROM is pretty impressive, as it shares several resources with the OnePlus 3's CyanogenMod 14.1 builds and therefore has identical bugs (read: nothing major, as the ROM works well, just some stability issues with the new OS).

But that is not all. In addition to the ROMs, custom kernel releases are also making their way to the device. XDA Recognized Developer franciscofranco has brought his famous franco.Kernel project to the OnePlus 3T. Franco's kernel joins the ranks of a few other notable kernels, like XDA Recognized Contributor eng.stk's Blu Spark kernel and XDA Recognized Developer jcadduono's Kali NetHunter kernel.

There's still more. Here are the rest of the works currently available in our forums:

You can view all of that and more in our OnePlus 3T device forum and development subforum.

What are your thoughts on the OnePlus 3T's development scene? Let us know in the comments below!



from xda-developers http://ift.tt/2gDJhDX
via IFTTT

lundi 12 décembre 2016

How to use Kustom Live Wallpaper: Part 2

This video is the second installment in a series of video tutorials on how to use KLWP to create awesome live wallpapers. You might want to check out the first video before diving into this one. Watch Marco take you step-by-step through the process.

Before you get started on this tutorial, make sure you've seen part one, as this video picks up right where that one left off.

Watch the full video and follow along with the easy instructions to create this awesome wallpaper.

For more homescreen videos, check out this playlist on our YouTube channel.



from xda-developers http://ift.tt/2gy8wYm
via IFTTT

The Inherent Problems in Speed Test Videos and Their Validity as Performance Benchmarks

Clarkson, Hammond, and May line up, each driving one of the world's fastest "hypercars": millions of dollars' worth of metal, aluminum, and carbon fiber waiting to barrel down the track, each wanting to take the throne as the fastest of the litter.

If you aren't familiar with the description above, this scene took place in the first episode of the new series from the Top Gear trio, "The Grand Tour". If you haven't seen it yet, you will soon see how it relates to this article, and if you have you probably already see the connection.


When users seek to measure how their particular device performs, even generally and unscientifically, they typically download a popular benchmark application like Geekbench 4 or PCMark. These applications stress the device using a preset method and then spit out a result, usually a number, that you can use to assert your phone's prowess relative to other devices on the market.

If you are a particularly savvy user, an application like DiscoMark can test the opening and closing speeds of applications, or if you are interested in storage speeds you can run a storage benchmark. One thing remains consistent, though, and that is the objectivity of the test being run. Geekbench 4 will run my Pixel through the same tests as my LG V20. As long as I am running the same version of the application, these tests are mostly uniform, and the same goes for PCMark and the storage benchmark — there are background services sucking up processing power and other variables to control if one is to get a truly accurate result, but for the most part the number will give you a general idea of your device's "tier." However, DiscoMark and other real-world benchmarks can throw quite a curveball into the mix, which is why DiscoMark scores need to be heavily vetted for accuracy and are not readily used — in fact, they shouldn't be used without careful control for confounding variables that throw off the results.

These applications have been designed to do just this: benchmark the overall or component-specific performance of your device. One could argue that such tests don't reflect actual real-world performance, that they are more like a 0-60 time done at sea level in controlled conditions by an experienced driver than a real-world test. Those arguments have some validity, and they speak heavily to the rise in popularity of "speed test" videos, in which a reviewer pits device A against device B and throws at them a bevy of real-world applications opened and closed in quick succession. But is there any validity to these tests? And what substantial knowledge, if any, can be gained from them?


Variance is the Spice of Life

As we mentioned above, applications like Geekbench 4 run in a specific testing environment. They are self-contained (although some draw system resources) and thus less likely to be affected by the environment directly (however, the hardware also has other tasks to deal with while running the benchmark). In contrast, a speed test is open to a host of variables: touch response times, background processes, the amount of user data synced, which side of Google's beloved A/B testing a phone happens to land on, the application state, the unavoidable human error… I could go on, but the point stands: with speed tests there are more variables that can, and do, affect the outcome.

For example, I did some very unscientific testing of my own between an LG V20 and a Google Pixel (5"), not unlike that of your typical speed test video, which displays one sample of the differences you can observe under various setups and starting states. It is of note that the V20 and Pixel tests were done with relatively the same user data installed as they were both running my daily driver account with my typical setup of 150+ applications installed and signed in.

Test                                  LG V20   Google Pixel   LG V20 (Clean Setup)
Clear All                             38.79    27.62          31.86
Reboot                                47.70    33.71          36.34
Reboot + Clear All                    33.49    30.42          28.22
Cache Cleared + Reboot                32.67    36.40          30.25
Cache Cleared + Reboot + Clear All    26.91    25.93          24.78

(All times in seconds; lower is better.)

All in all, there is a gap of roughly 10 to 21 seconds between the fastest and slowest times of each phone, and beyond that there are a few other things to be gleaned from these results. The first is that a reboot alone will not always produce your fastest results; instead, the fastest times came after the cache was cleared, the phone was rebooted, and everything running was killed with the "clear all" button in recents. If I were to put out a video of either of those two tests, with no context or details about the test environment and starting conditions, this is probably what the comments section would look like:
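As a quick sanity check, the per-device spread can be recomputed directly from the table. This is a minimal sketch; the numbers are simply the seconds from the table above:

```python
# Total app-cycle times (seconds) from the table above, keyed by device/setup.
results = {
    "LG V20":               [38.79, 47.70, 33.49, 32.67, 26.91],
    "Google Pixel":         [27.62, 33.71, 30.42, 36.40, 25.93],
    "LG V20 (Clean Setup)": [31.86, 36.34, 28.22, 30.25, 24.78],
}

# The gap between a device's best and worst run shows how much the
# starting conditions alone can move the result.
for device, times in results.items():
    spread = max(times) - min(times)
    print(f"{device}: fastest {min(times)}s, slowest {max(times)}s, spread {spread:.2f}s")
```

The spread within a single device is on the same order as the gap between the two devices, which is the whole problem with one-shot comparisons.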

"TL;DW – The Pixel beat the V20 by over 10 seconds in a 40 second test. LG needs to debloat and switch to NVME."

Or

"TL;DW – The Pixel and its "optimization" only beat the V20 by 1 second even with its bloat. Why do they even charge so much?"

The crazy thing is that both of those TL;DWs would be correct depending on which test I decided to show, whether to fit a narrative or simply because I did not do enough testing, control variables, or ensure the starting conditions were as similar as possible. While both phones showed relatively similar improvements across the tests, it is easy to see how these results could be taken out of context, and proper context is something few if any speed testers actually provide.

Further, why did I choose those applications? What if I used Facebook Messenger instead of Hangouts, Spotify instead of Google Play Music, or GTA III instead of Mikey Shorts? Could the results have been different? Would I suddenly hate my Pixel because the V20 may beat it in that scenario, instead of this one? As I said earlier, there are too many variables involved to make a final decision with things as simple as clearing the cache and rebooting completely changing the outcome.


Extraneous Factors & Common Mistakes

When we do our benchmarks and gather app opening speed data, we make sure to strip the phone clean of the elements we know are likely to cause interference, such as bloatware. Some phones are understandably worse than others when gathering data; for example, it was really hard for us to find reproducible DiscoMark results on the Galaxy Note 7, whereas the numbers we got for the OnePlus 3 were extremely consistent across factory resets, multiple devices, and even on different Google accounts, given the same initial conditions. With Samsung's 2016 phones, we found insane variance not just in app opening speeds, but also regular benchmarks — more than on other devices like the Pixel XL. We understand the importance of gathering data that's reproducible and as consistent as possible, even if this is often an unattainable goal when shooting for perfection. We often disclose the conditions of our tests so that the reader can get a feel for what we did to get our numbers: we disclose initial temperature, whether bloatware was removed, that we ran them after a factory reset, on the same Google account, WiFi network and surface, etc. Even then, it's hard to be fully satisfied with the testing environment, as there will always be some degree of variance.

Over the last year, we saw speed test videos becoming even more prominent than before as the number of tech YouTubers grew. They are an easy vehicle for quasi-technical insight, but many people running the speed tests show little understanding of the factors at play. For instance, most focus on the processor as the main contributor to the speed differences, when the storage is arguably a much bigger (if not the main) factor when it comes to app opening speeds (especially since many Snapdragon 820/821 phones max out all clockspeeds on all cores specifically when opening applications). This isn't to mention the misconceptions brought by the Snapdragon 821 revision. Another important aspect is the filesystem employed, something we found to make a significant difference in particular when opening heavy games, which are usually one of the biggest deciding factors of most speed test videos. For the "RAM round", the memory management solution employed is also really important: for example, you might recall how the Note 5 and OnePlus 3 were initially blasted for being terrible at this part of speed tests; at XDA, we focused on finding the root cause and a simple fix dramatically improved the situation. After OnePlus improved memory management in a following patch, the OnePlus 3 became one of the top performers in these speed tests. Out-of-memory / Low-Memory-Killer values, as well as ceilings to the amount of background apps, are things to take into account when evaluating memory management capabilities, and these can change from one build to the next. Thus, it's of great value to be able to spot whether the issue holding back performance can be addressed through a software fix, or a user modification.
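For illustration, the Low-Memory-Killer thresholds mentioned above are exposed on many pre-Oreo Android kernels as comma-separated page counts in /sys/module/lowmemorykiller/parameters/minfree. The sketch below only converts such a string into megabytes; the sample value is a hypothetical reading, and actual values vary by device and build:

```python
def minfree_to_mb(minfree: str, page_size_kb: int = 4) -> list:
    """Convert a lowmemorykiller 'minfree' string (page counts) to megabytes.

    Each value is the free-memory threshold at which Android starts killing
    the corresponding app-priority tier; higher thresholds mean background
    apps get evicted sooner.
    """
    pages = [int(p) for p in minfree.split(",")]
    return [round(p * page_size_kb / 1024, 1) for p in pages]

# Hypothetical reading from /sys/module/lowmemorykiller/parameters/minfree:
sample = "18432,23040,27648,32256,36864,46080"
print(minfree_to_mb(sample))  # → [72.0, 90.0, 108.0, 126.0, 144.0, 180.0] MB
```

Comparing these tiers (and how an OEM tunes them between builds) is exactly the kind of detail that separates a software-fixable memory-management problem from a hardware one.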

Which leads us to software updates: it's very important to disclose the OS version that the tests are running on, because these can bring dramatic performance changes and essentially redefine the entire result ladder. You might recall, for example, how the Nexus 6 originally had real-world performance constraints due to forced encryption, which was promptly disabled not long after. The Nexus 6 also received a significant kernel patch early into its life that altered performance by leveraging its four cores better. Another example is that a tester running a community build of the OnePlus 3's software would unknowingly have an advantage over the default software branch should he be testing the phone with the F2FS improvements on board. So on and so forth.

Then there is the question of the applications themselves; ideally, neither phone should feature OEM applications not available on the other device, as these are coded and optimized differently, possibly yielding significant differences. There are also bundled services (like on Samsung devices) that run in the background for no apparent reason and can also contribute to speed differences, so minimizing background interference should be a priority for an efficient test environment. While it can be argued that these should be left alone to mimic a real-world environment, I'd say they are too volatile to control for — one example we noted in a review was how Samsung's Text-to-Speech engine was somehow hitting up to 12% of the CPU load while playing Asphalt 8, a completely unrelated task. App syncing can have a dramatic effect on the resulting performance as well; a simple way to assess whether the device is ready for testing is to look at the CPU clockspeeds while idling on the homescreen, to make sure nothing is unduly influencing the test environment.
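That idle-clockspeed check can be semi-automated: per-core frequencies are readable on most Android devices from /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq (for example via adb shell cat). The sketch below only evaluates readings you have already collected; the sample values and the 600 MHz idle ceiling are illustrative assumptions, not figures from the article:

```python
IDLE_THRESHOLD_KHZ = 600_000  # assumed "idle" ceiling in kHz; tune per SoC

def is_idle(freq_readings: list) -> bool:
    """Return True if every online core sits at or below the idle threshold.

    freq_readings holds scaling_cur_freq values (kHz) gathered from
    /sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq on the device.
    A core pinned at high clocks on the homescreen suggests background
    work is still running and the speed test should wait.
    """
    return all(f <= IDLE_THRESHOLD_KHZ for f in freq_readings)

# Hypothetical samples: one device parked at its floor clocks, one still busy.
print(is_idle([307200, 307200, 384000, 384000]))    # prints True
print(is_idle([307200, 1824000, 384000, 2150400]))  # prints False
```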

Then there is human error: it's rare to find a tester with both perfect vision and iron reflexes, so it's hard to know whether a tacit delay is affecting the results; when the test stretches for minutes on end, these little errors add up. A small but significant factor is also that touch latency varies across devices, and even across software versions, something that isn't taken into account in the string of multiple tests. ROMs can also ramp up the CPU frequency for the next activity after a screen tap, introducing behavior that favors tests run in close succession over others. Even worse, I've seen some speed tests effectively butcher the results due to their system settings: for example, a popular Latin American YouTuber recently had to redo an entire speed test because he had not noticed that the device's home button was set to open the camera on a second tap; the minor delay while waiting for that second input on every home button press added up to several seconds.

These are only some of the issues we have with these tests, specifically the mistakes we've seen manifested in videos over and over. Most of these problems likely show up in every video in some form or another, though not always severely. They can add up unpredictably, and without specific comments on the tester's methodology they are often enough to tarnish the legitimacy of the results. It's not rare, for example, to find users in comment sections complaining that their device does not behave that way, sometimes even providing concrete examples. And it's very hard to judge the behavior of these devices from the single sample the videomaker decided to show the public — we can't even know whether the creator bothered to test the devices multiple times before recording, to make sure the results were consistent and reproducible. When we gather our data, we make sure to get at least 100 data points per application (it sounds like more work than it is, as this is mostly automated) — now imagine if we were to show you a single result at random instead of a boxplot displaying the interquartile ranges and median. It's simply impossible for the viewer to assess whether the results shown in the video are mere outliers.
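To see why a single draw misleads, here is a minimal sketch with simulated launch timings (the distribution parameters are invented for illustration, not measured data) contrasting one random run against the median and interquartile range of 100 runs:

```python
import random
import statistics

random.seed(42)  # reproducible demo

# Simulate 100 app-launch timings (ms): a stable core of runs plus a few
# slow outliers caused by background work, as described in the article.
runs = [random.gauss(400, 15) for _ in range(95)] + \
       [random.gauss(700, 50) for _ in range(5)]

single = random.choice(runs)  # what a one-shot video effectively reports
q1, median, q3 = statistics.quantiles(runs, n=4)

print(f"one random run: {single:.0f} ms")
print(f"median: {median:.0f} ms, IQR: {q1:.0f}-{q3:.0f} ms")
```

Roughly one run in twenty here lands near 700 ms; a video showing that run "proves" the phone is 75% slower than its typical behavior, which is exactly the outlier problem a boxplot makes visible.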


Quite Entertaining, Not Quite Insightful

So what data can be gleaned from these tests and this discussion? Are we saying you should not bother watching these speed test videos? Not at all — but we need to recognize that these tests are highly volatile. As mentioned earlier, even if you use the "same apps" in the comparison, things like A/B testing on a developer's side and the built-in, device-specific applications such as the calendar, dialer, clock, and camera impact results. Software updates also weigh heavily on results, as OEMs can patch early bugs or make major revisions in early beta and community builds. Further, as we demonstrated, there is a tangible difference in performance depending on whether you rebooted the device, cleared its cache, or had it set up as a personal device with apps and data loaded rather than merely as a test unit. Finally, there is the human element: as fast as we may be, the delay between seeing an app load and tapping the key changes from run to run and device to device, and human error of some sort can be seen in almost every speed test video. All of these things can affect the final results of these speed tests more than the actual hardware, to say nothing of cross-platform tests, where these variables increase substantially.

Clarkson, Hammond, and May ran the test we spoke about in the intro time and time again, each run with a different result. May failed the launch control on the LaFerrari a few times, Clarkson forgot to engage his wing, and Hammond miscounted his three seconds. So what is the lesson? Like speed tests, this drag race was not a feasible method of determining which car was faster, as the variables involved were too great. Instead, "The Grand Tour" settled the comparison with an all-out race around the track, using multiple laps to find the quickest time and a single driver to compare each car equally, making it the most controlled environment they could get. The final winner was decided by less than a tenth of a second, far closer than any of the drag races, and with a different outcome than most of them.

But just because the drag race had no bearing on deciding which car was faster doesn't mean it was any less fun to watch. It was just that, though: fun to watch, while holding little reliable data from which to draw a conclusion. The same is true of speed tests. Yes, they can be entertaining, and some data can be gleaned from them in very general terms, more so when the deltas are large, but largely they are too tied to the reviewer's decisions and the settings chosen before and during the test to support any conclusive comparisons… the main issue being that a large chunk of context is often missing. That said, we really appreciate the medium's ability to shine a light on certain performance issues, namely the memory management problems we mentioned above. Such "internet drama" can prompt OEMs to fix these issues sooner rather than later, which ultimately benefits us all. And we must add that we recognize some testers clearly do a better job than others.

Finally, maybe that isn't the only lesson to take from our friends at The Grand Tour. The fact that they ran multiple drag races, each with a different outcome, and that the final result was decided by such a close margin suggests that many of today's flagships sit within a performance margin of error for real-world use. It also shows that regardless of which device you purchase, the applications you install and the decisions you make likely have a larger impact than the hardware alone.



from xda-developers http://ift.tt/2hqz3E7
via IFTTT

Google Launches Website for Android’s Enterprise Presence

We already know how flexible Android is, mainly because the base platform is open source. This means it can be adapted for a wide number of different use cases and one of those is how Android is used in businesses. Not many people understand the type of impact Android can have in the enterprise, and this is why Google has decided to launch its new Android Enterprise website. It can be found at http://ift.tt/2asCTc7, and it showcases how Android can be used in the work environment.

Google has been working to get Android used in more companies with its Android for Work solution, which has been enhanced over time, and Google continues to improve Android for enterprise use. Along with highlighting this solution, the new Android Enterprise website also mentions devices built for specific purposes within company use cases. It features a couple of tablets (like the Pixel C), but also talks about how the Zebra (Symbol) TC75, Honeywell CT50, and CAT S60 can be used in the field.

Google then goes on to talk about specific areas where Android is currently being used, and the benefits it brings. The three areas highlighted are Retail, Healthcare, and Manufacturing. In retail, Android can power PoS systems, interactive catalogs, and proximity-based beacons that deliver relevant or even personalized content. In healthcare, patients can use Android to check in and learn about procedures and post-surgery care, while clinicians can use it for on-campus communication, scanning supplies, and accessing other applications.

Manufacturing is full of purpose-built products and Android has been known to enhance this as well. Google talks about manufacturers using Android to optimize workflow, gather immediate information from the production floor, and create an agile work environment that enables informed, real-time decision making. They feel Android can save a manufacturer money by using phones, tablets, customized devices and sensors to extract and analyze a ton of data.

So be sure to check out Google's new Android Enterprise landing page if you're curious.

Source: Android Enterprise



from xda-developers http://ift.tt/2gwp56G
via IFTTT

dimanche 11 décembre 2016

Sultanxda’s Unofficial CyanogenMod 13 Lands on the OnePlus 3T through Unified Build

Many users were understandably apprehensive that the development scene on the OnePlus 3T might not pick up, as the device's predecessor launched just a few months earlier and users and developers had already settled on the OnePlus 3, which has a booming ROM scene by comparison.

However, development on the 3T is also picking up. Just a few hours ago, XDA Recognized Developer Grarak tweeted about how he managed to boot CyanogenMod 13 on his OnePlus 3T, and Sultanxda has offered a public release of his unofficial CM13 as well.

XDA Recognized Developer Sultanxda's custom ROM for the OnePlus 3T is now available for download. Sultanxda is best known for his camera-related work on the OnePlus One and several other development projects spread across a few devices. The custom ROM consists of his own modified CyanogenMod 13 builds based on Android 6.0 Marshmallow, using his own custom kernel. The ROM is based on the stable CyanogenMod branch instead of the nightlies, as stability is one of its prime focus areas.

The ROM interestingly supports both the OnePlus 3 and the OnePlus 3T, meaning that you can flash it on either device as long as you have a compatible recovery. Some of the other features are as follows:

  • OTA updates via built-in CMUpdater
  • Improved GPS speed and accuracy
  • Custom camera app featuring:
    • Photo quality comparable to OxygenOS
    • Anti-shake mode (increases the shutter speed to reduce motion blur)
    • Manual shutter speed control (1/5000th of a second up to 30 seconds)
    • Manual ISO control
    • EIS when recording video at resolutions lower than 4k UHD
    • Video HDR mode
    • Antibanding control
    • Exposure control
    • Denoise control
    • Face detection
    • HDR
    • Many other manual controls
  • Many other misc. performance and stability improvements under the hood
  • Kernel features:
    • Rebuilt from the ground up using the latest Snapdragon 821 CAF base from Qualcomm (LA.HB.1.3.2)
    • Removed lots of excessive bloat (improves security and performance)
    • Improved stability (several bugs not listed here have been fixed)
    • DASH charge
    • Dynamic CPU input boost driver I wrote myself (makes the phone feel smooth without destroying battery life)
    • Reduced display power consumption
    • Haptic feedback is automatically disabled during phone calls and video recordings
    • Improved touchscreen processing
    • Improved audio jack detection (no more weird buzzing noise and headphones are always detected on the first try)
    • CPU underclocked by default (big cluster: 2054MHz LITTLE cluster: 1593MHz) (you can disable this; read the FAQ for more info)
    • Custom thermal control driver I wrote myself (features 9 thermal throttle steps; keeps the phone cool)
    • Westwood TCP congestion algorithm (enabled by default)

There are no bugs mentioned right off the bat. Root access is not included in the ROM by default, so you will need to flash your preferred root solution separately.

For downloads and discussions, please head on over to the forum thread.



from xda-developers http://ift.tt/2hsp97K
via IFTTT

samedi 10 décembre 2016

ZenWatch 3 Review: As Smartwatch Interest Wanes, ASUS Offers a Compelling and Competitive Wear Product

Smartwatches have been on a steady decline for what feels like ages, at least in terms of the interest they garner. It has been a while since analysts even acknowledged the smartwatch as a market force, likely due to their failure to predict the so-called "Year of the Smartwatch" time and time again.

With the death of Pebble, an iconic pioneer of the modern smartwatch scene, and the delay of Android Wear 2.0, smartwatch lovers (the few out there) have seen the short-term prospects of their wrist platform dwindle into irrelevance. This is further amplified by the fact that other smartwatch makers like Huawei and Motorola haven't been refreshing their smartwatches, and there don't seem to be any current plans to do so either. Meanwhile, Samsung's wearable platform keeps raising the stakes and expanding its ambitions with MST payments and more features than you'll ever need on your wrist, but even this approach receives little fanfare and enthusiasm from the tech-enthusiast community at large.

Smartwatch makers have tried all sorts of things to attract a wider crowd — premium and luxurious designs, over-the-top feature sets, mimicking old timepieces, going full-techie, implementing e-paper displays, round watchfaces, shortcuts, gestures, sensors… Alas, as compelling as these smartwatches can be for the notification-riddled tech enthusiast, there is still no killer feature to be found. That doesn't stop companies like ASUS from trying to iterate and innovate, though. Their latest ZenWatch 3 is, in this sense, the last ambitious Android Wear watch before the inevitable round of Wear 2.0 devices.

Before diving into this review, it must be pointed out that the ZenWatch 3 was originally slated to be released with Android Wear 2.0. At launch, we learned that it instead runs Android Wear 1.5, coinciding with the delay of Wear 2.0 and the extension of the respective Developer Preview. The ZenWatch 3 will almost certainly receive Wear 2.0 in the future, and although this watch doesn't bring the radical redesign of the new wearable Android version, it still brings a plethora of modifications and features that can make the experience meaningfully different — that is, if the user cares to use them.


Hardware, Aesthetics & Fit

Android Wear smartwatches have actually offered quite a bit of variety and diversity in hardware design, allowing consumers to choose between all sorts of wearables, from the gadgety-looking to the traditional premium timepiece. The ZenWatch 3 is not quite the run-of-the-mill smartwatch, but it also doesn't look like something only a mother could love. The outer shell can fairly be described as "steampunk", for a few reasons I'll note below, though even that isn't quite an accurate description. Let's explore each of its elements and how they influence the overall design before making a final assessment:

Starting with the front and face of the watch, we find a sort of "solar eclipse" design with a prominent gold trim around the display and a sleek black stainless steel (316L) finish all around the body. The display is covered by 2.5D Gorilla Glass (not sapphire glass), and while the bezels around it aren't super thin, the soft gold ring and the curve just outside it help give the watch a slicker appearance. The top and bottom are flanked by two oversized band-holders, still slick and black but not quite as shiny and polished as the rounded sides of the watch. Unlike on other smartwatches such as the Huawei Watch, I found these to be conveniently placed and angled — they don't hug your wrist too much, but they also don't hover over it awkwardly. Moreover, their height is appropriate, so they don't make the watch look thicker than it is, as the original Moto 360's did precisely because its bands held the wrist so tightly, with the entire watch rising above them.

Moving onto the sides, you'll find that the ZenWatch 3 comes with three prominent buttons that stick out of the body very noticeably. They all look like buttons you'd find on a classic timepiece rather than a modern watch, giving the side of the watch an industrial look that complements the "steampunk" vibes accentuated by the bronze-like motif of the gold trim. The middle button bears a hint of gold in the form of a small ring around it as well, and all three buttons are solid and springy, but not clicky — a perceptible difference from the tactile feedback of other smartwatches. The three buttons are programmable, which we'll discuss in the software section, and they are arguably one of the stronger selling points of this hardware package.

Hopping onto the back, we find a solid plate with speakers as well as the magnetic pins for fast charging (discussed in the user experience section). While there is no wireless charging (the pins work well, though), you still get the nightstand-clock functionality prominently featured in the Moto 360 line and its wireless charging cradles. As you can see, there is no heart-rate monitor here. I must also point out that the watch is rather thin at 9.95mm, and it certainly looks and feels thinner than many competing smartwatches. The included band is made of a very dark brown stitched leather that passes for black under most lighting conditions, and I found it very comfortable and serviceable, although it's worth noting that the band attachment mechanism is proprietary, meaning finding replacements is not as easy as looking for a standard 18mm band.

I ultimately found the hardware to be an attractive if unconventional alternative, and I've gotten many compliments and inquiries regarding the watch as well. Some software decisions neatly complement the aesthetic, in particular the included watchfaces, and I must say it's one of the most comfortable smartwatches I've worn. I haven't had issues fitting it under sleeves, it doesn't look too over-sized on my admittedly thin wrist, and it can even pass as a regular watch to many people depending on your watch face. The steampunk vibes might not fit all styles, though, and it's certainly not as widely appealing a design as other smartwatches, which are easier to match clothes to if that's something you care about.

What about functional and internal hardware? Most items on its specification sheet are what you'd expect out of your average Android Wear smartwatch. First things first, it does come with IP67 certification for dust and water resistance, up to 1 meter of water for 30 minutes. The screen is a 1.39 inch AMOLED panel with a relatively high 400×400 resolution, at 287 pixels-per-inch, with no flat tire despite having a handy ambient-light sensor. I found the screen to be colorful and bright under most lighting conditions, although the ambient light sensor wasn't as quick to act and adapt as I'd hoped, meaning I found myself manually adjusting brightness a few times — it also tends to set the brightness too low for my eyes.

Finally (because this is a smartwatch, after all), this device is one of the first to pack the new Snapdragon Wear 2100 chipset, which promises up to 25% lower power consumption (we'll discuss performance in the following section), alongside a standard 512MB of RAM and 4GB of internal memory. It does not have NFC, GPS, or cellular connectivity, but supports WiFi and Bluetooth Low-Energy v4.2. It also lacks a heart-rate monitor despite ASUS' emphasis on fitness with their dedicated ZenFit app.


Software and User Experience

The ZenWatch 3 is packed to the brim with features, in a way that most smartwatches are not. While Android is an open platform, Android Wear is not — this is something with faults and benefits we've debated before, and ASUS isn't the first OEM that wanted Wear as a platform to be better-tuned to their product. But those minor changes aside, the experience is very much what you'd expect out of Android Wear, and I'd argue it's the additions that ultimately make the user experience stand out. ASUS' leveraging of their hardware assets, such as their masked ambient light sensor and three-button configuration, are some of the key differences (and perhaps advantages) of this smartwatch.

Starting with the UI, we find that this version of Android Wear still functions just like you'd expect, and it sadly doesn't bring the features we were promised with Android Wear 2.0. The ZenWatch 3, however, manages to run the core functions really well — performance is on par with other Wear devices (bar the original Moto 360), the gestures actually work better than on older Wear watches too, and it generally ticks at a fast rate. The screen is a little on the warm side, something that makes some white icons and white text look out of place, but other than that the screen works really well with Android Wear. The display experience has been above average for two reasons, in my opinion: first, it is a fully circular AMOLED display with an ambient light sensor (no flat tire like we saw on multiple other watches), and second, ASUS' included watchfaces make good use of the rich display by not "gimping" the ambient mode.


Screen brightness is appropriate outdoors, but sunny days will give it a challenge

That last point might be one that turns off wary customers afraid of burn-in or poor battery life, but so far I haven't experienced either of those issues. Granted, the former would take far longer than two weeks to manifest itself, but the always-on mode of the stock watchfaces hasn't really made the device's battery life too bad for my use case. The ambient version of the stock watchfaces still doesn't update as frequently (i.e. no seconds hand or ticker, or moving elements), and they make the watch look even better when idling. There are over 50 watchfaces after downloading the companion app, most of them with classic watch designs featuring brown, dark grey and gold accents to complement the watch's physical aesthetic, and then there are some more abstract watchfaces similar to Moto's rotational watchface as well. Below are some of my favorite included watchfaces:

[Four of the included watchfaces]

Of course, you can also pick your favorite watchface from the Play Store or make one through Facer, but I do think ASUS did a good job with the bundled options, as they make good use of the screen in conjunction with the watch's design language. And that's just scratching the surface of what ASUS offers on top of the Android Wear base: the company bundled in multiple features, most in the form of watch apps, and a dedicated watch manager app can be downloaded to further customize and add functionality to the watch.


Beginning with my favorite feature, the buttons on the ZenWatch 3 are programmable to launch applications (after downloading ZenWatch Manager, you can customize both the top and bottom buttons). By default, the top button launches ZenFit (ASUS' fitness tracking app) and the bottom one allows you to enter ECO Mode (more on that later), but being able to customize both can lead to some useful shortcuts. For example, I set one button to launch the Hangouts app in order to quickly check my group chats and messages, as well as reply to any pending chats. I've also used this feature to keep track of my location through Google Maps while going through new bus routes and streets, and it's also useful when set to launch your calendar or agenda. Moreover, I can imagine other useful situations for certain people or on certain days, such as quick access to a stopwatch, calculator or Google Translate. The watch does a good job of keeping those watch apps in RAM, too, making the transition fast and fluid.

To unlock the full feature potential of the ZenWatch 3, one needs to download the "ZenWatch Manager" app from the Play Store. This grants you access to over 50 exclusive watchfaces, most of which aren't very attractive, but many of them do fit the aesthetic nicely — some even reinforce the "steampunk" vibe by displaying moving cogs (albeit at a slow framerate). Some of the key features include grouping your favorite watch faces, a selection of watch apps (by ASUS), the ability to cover the watch face to mute alarms, a watch finder (uses vibration and sound), better remote call control, an SOS app (can be tied to button shortcuts), a forgotten-phone warning, and a flashlight app (which the Play Store is riddled with). Finally, there is a remote camera app to install, and a watchface designer that allows you to configure ambient mode in watchfaces as well. They also include a shortcut to Smart Lock and call it a feature, but it's really just a shortcut.

I haven't used many of ZenWatch Manager's features, but I did like the watchface customization screen for configuring my information widgets (fitness metrics, battery stats, etc). The ones that worked the best for me are the ones integrated into the UX passively, like the call manager, which worked better than the default solution while answering calls out and about. I haven't used the speaker for calls, although it gets surprisingly loud when playing alarms or music.


As for battery life, I managed to get over 24 hours out of the ZenWatch 3 with no real issues, and that's as a notification-heavy user. I also ran the detailed ambient mode, and none of my watchfaces featured deep blacks for AMOLED battery savings. I did leave my watch on overnight every time, so I could have probably squeezed close to two work days of usage by turning it off at night — I just haven't felt the need to, because the ZenWatch features really, really fast charging. This is one of my favorite features: ASUS claims that you can charge up to 60% in 15 minutes, and my testing confirms it (a full charge takes about 40 minutes). It does indeed charge very fast, fast enough that 15 minutes in the morning are enough to power a solid work day.
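
For the curious, a quick back-of-the-envelope check of those charging figures (using the claimed 60% in 15 minutes and the roughly 40-minute full charge observed above) shows how much the charge rate tapers off after the fast phase:

```python
# Implied average charging rates, assuming the figures above:
# 60% gained in the first 15 minutes, ~40 minutes total for a full charge.
fast_phase_pct = 60   # % gained during the fast phase
fast_phase_min = 15   # duration of the fast phase, minutes
total_min = 40        # approximate total time for a full charge

fast_rate = fast_phase_pct / fast_phase_min                      # %/min early on
taper_rate = (100 - fast_phase_pct) / (total_min - fast_phase_min)  # %/min after

print(f"fast phase:  {fast_rate:.1f} %/min")   # 4.0 %/min
print(f"taper phase: {taper_rate:.1f} %/min")  # 1.6 %/min
```

In other words, the first phase charges roughly two and a half times faster than the remainder, which is consistent with the usual lithium-battery taper near full charge.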

I haven't felt that the battery life was exceptional, but the fact that there is an optional battery shell (which does add thickness, but adds charge nonetheless) means there is at least an option to extend battery life on the go by about 40%. Finally, on top of the default battery saving functionality there is an "ECO mode" that ASUS claims can double your battery life (with the penalties you'd expect out of an "ultra stamina" mode). Having those options available can probably help with battery anxiety, but I honestly haven't felt limited by the watch's endurance, in part because of the convenient charging speeds.


A Serious Competitor

The ZenWatch 3 is a nifty Wear device, and I'd go as far as saying that it's the most complete Android Wear package out there. ASUS provides a good-looking watch with extensive software options, some of which are actually very useful and synergize well with the hardware. It doesn't compromise the screen's circumference for an ambient light sensor, it has useful hardware shortcuts… it's light, thin, and comfortable. It does outclass previous generations by packing the Snapdragon Wear 2100 with a standard battery, although the gains are not necessarily noticeable in day-to-day usage. The charging mechanism is convenient to use, too, and wicked fast — I only wish the included charger's cable were longer, or that it weren't fused to the cradle. In terms of hardware, the main shortfall is the exclusion of key sensors like a heart-rate monitor, GPS, and NFC, the latter likely becoming a compromise once Android Pay inevitably hits Wear.

The premium design is well-realized and certainly something you'd expect at a price higher than its $229 tag, which not only is lower than most competitors' launch prices, but also lower than their current market prices. This makes the ZenWatch 3 extremely competitive in a market that is lacking renewed options, but there are a few aspects that might deter potential customers: the design doesn't flow quite as well with all styles, and I'd say it has more limited appeal than that of other smartwatches. And while it packs many hardware advantages, we are on the brink of a new smartwatch generation that's bound to up the ante even further. Nevertheless, the price is what makes it for me — it's a very compelling option for those looking to buy a smartwatch at this moment in time. Admittedly, there isn't a huge market for Wear smartwatches right now, and even tech bloggers and reviewers are increasingly freeing their wrists from another buzzing machine. But I am not, and I'm being fair when I say I've enjoyed the ZenWatch 3 more than my Gear S2 and Moto 360, even if none of these devices add indispensable value to my life.

Check Out XDA's ZenWatch 3 Forum >>



from xda-developers http://ift.tt/2gMMahS
via IFTTT