
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #7891

    Anandtech: Intel Launches Stratix 10 TX: Leveraging EMIB with 58G Transceivers

One of the key takeaways from Hot Chips last year was that Intel’s EMIB strategy was going to be focused primarily on FPGAs to begin with. Intel has pursued a bridge-and-chiplet strategy: with the right FPGA, up to six different chiplets can be added via EMIB in a single package, well beyond the single EMIB implementation coming to consumers. Intel has already been selling its Stratix 10 family of FPGAs, which gain additional functionality through EMIB, for a number of months. Today Intel is announcing the latest member of that family, one that includes 58G transceivers: the Stratix 10 TX.
Intel will offer different variants of the Stratix 10 TX, from 600k logic elements with two chiplets up to 2.8 million logic elements with six chiplets. Five of the tiles are capable of taking the new 58G transceivers, enabling up to 144 transceivers in a single package; the sixth tile is used for PCIe. While the central FPGA is built on Intel’s 14nm process technology, the transceivers are built on TSMC’s 16FF process, owing to Altera’s history of using TSMC for its analog hardware. Intel states that the transceivers can run as slow as 1 Gbps if needed, enabling backwards compatibility with existing networks. First demonstrated back in March 2016, the transceivers are finally making it to market.
Intel is calling the new transceivers dual-mode 58G: they support four-level pulse-amplitude modulation (PAM4) as well as 30 Gbps non-return-to-zero (NRZ) signaling. Previously these have been called 56G, but Intel is using the term 58G because it believes its solution will perform better than other 56G solutions on the market, enough to justify calling it a 58G device.
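As a rough illustration of the dual-mode distinction, the sketch below (an illustrative encoder, not Intel's silicon) shows how PAM4 packs two bits into each of four amplitude levels, which is why a ~29 GBd link can carry ~58 Gbps while the same channel running NRZ at 30 GBd carries only 30 Gbps:

```python
# Illustrative sketch (not Intel's implementation): PAM4 maps two bits
# per symbol onto one of four amplitude levels, so a link carries twice
# the bit rate of NRZ at the same symbol (baud) rate.
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): 1, (1, 0): 3}  # Gray-coded levels

def pam4_encode(bits):
    """Encode an even-length bit sequence into PAM4 amplitude levels."""
    assert len(bits) % 2 == 0
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

symbols = pam4_encode([1, 0, 0, 1, 1, 1, 0, 0])
print(symbols)  # eight bits become four symbols
# At ~29 GBd, 2 bits/symbol gives ~58 Gbps; NRZ at 30 GBd gives 30 Gbps.
print(58 / 2, "GBd symbol rate for a 58G PAM4 link")
```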
    The aim for these new Stratix 10 devices is for driving networking bandwidth needs for centralized base stations and network virtualization. The latter here is critical for 5G implementation, where compute happens in edge networks and VMs are transferred around to enable compute that has to happen closer to the device, especially with software defined networking. The Stratix 10 also features a range of IP cores, for 100GbE MAC and FEC, and the transceivers are built through a range of Intel and third-party IP.
We are told by Intel that using EMIB in conjunction with the new chiplets enables a much lower-power product than previous designs. Intel offers customers substantial training, but EMIB implementations are essentially transparent to the design: the upside is bandwidth, and the ability to create use cases from that functionality.
Alongside the Stratix 10 TX, Intel will also be offering a TX version with HBM2 (high-bandwidth memory). In the largest design, the package can have two 4GB HBM2 tiles paired with three 58G transceiver tiles and one 28G transceiver tile; that last one is 28G because it also carries a hard PCIe connection. The HBM2 memory can run at either 800 MHz or 1 GHz depending on the product.
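For context on what those HBM2 clocks mean, a back-of-the-envelope calculation (assuming the standard 1024-bit-per-stack HBM2 interface, which Intel has not broken out here) gives the peak bandwidth per tile:

```python
# Rough HBM2 bandwidth per stack: a stack exposes a 1024-bit interface
# and transfers data on both clock edges (double data rate).
def hbm2_bandwidth_gbps(clock_mhz, bus_bits=1024):
    """Peak bandwidth in GB/s for one HBM2 stack at a given clock."""
    transfers_per_sec = clock_mhz * 1e6 * 2        # DDR: two transfers per cycle
    return transfers_per_sec * bus_bits / 8 / 1e9  # bits -> bytes -> GB/s

print(hbm2_bandwidth_gbps(800))   # per-stack figure at 800 MHz
print(hbm2_bandwidth_gbps(1000))  # per-stack figure at 1 GHz
```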
    Intel is shipping the Stratix 10 TX family from today.
    Related Reading





    More...

  2. RSS Bot FEED
    #7892

    Anandtech: Samsung AR Emoji: It Almost Looks Like Me, I Guess

BARCELONA, ESP – One of the ‘innovations’ of the latest generations of smartphones has been interactive avatars that devolve into emoji. Using a bit of photography and some real-time face mapping, users of certain devices can create mini-avatars of themselves or emulate other non-humanoid avatars. Where the companies differ is in the implementation of these features, how interactive they can be, and how they are presented. As part of the Samsung S9 launch event here at Mobile World Congress, I tried the ‘AR Emoji’ feature. It went something like this.
After taking a single selfie, the system took about 20 seconds to map the image to a model. It asks whether the model is male or female, and then the user can fine-tune things like skin color, hair color, hair style, and accessories like glasses. It certainly uses better algorithms than the ‘create your own character’ features in some sports video games, and the software was even accurate enough to pick out the red pimple that had developed on the right of my nose. It got my hair as parting from right to left as well, although it did render my hair as grey.
All of this seemed reasonable – as an interactive character, it did follow my facial features such as cheekbones, eyebrows, and chin. Where it fell down, however, was with the eyes. It did not react to my eyelids at all, and it seemed (looking at other people’s AR Emoji) that the eyes were a fixed feature on all implementations. It slowly became a sort of cold death stare that turns the most joyous of people into emotionless workaholics. It does not react to the tongue, either.
This is one of the problems when dealing with life-like, reality-enhanced avatars: even the slightest thing that is off will seem obviously so. The human mind has evolved to recognize human faces, and the defects within them, to a sometimes unnerving degree. This is why drawing faces can be difficult – one slight mishap and it looks very wrong. Apple skirted the issue by keeping its ‘Animoji’ limited to animals, rather than dealing with actual human-like avatars. But as I said before, Samsung’s implementation is definitely better than any video-game ‘create a character from your picture’ tool I have ever seen.
The goal of AR Emoji is similar to other systems of this kind: use the avatar to create custom emotions and messages that can be shared with other users. In the iOS ecosystem this is fairly easy, but on Android it is a little more difficult, requiring all third-party apps to use the same API calls. AR Emoji is a custom Samsung implementation, meaning that applications have to partner with Samsung in order to use it. Nonetheless, the Galaxy S9 does support ARCore. Samsung also provides a series of 15 emoji stickers based on the avatar that can be used with popular chat applications like WhatsApp, WeChat, and Line.

    Real Me vs Avatar Me
I think what surprised me about the feature is that, from a single photo, it made a reasonable attempt at an avatar in around 20 seconds. I have seen other journalists report less-than-desired results in how their avatars turned out. This is in contrast to Sony’s 3D modeling feature, shown last year, which requires a panned scan from ear to ear to build a model. In this case, Samsung should offer the option of a more detailed scan using multiple pictures, to help get color and dimensions more accurate.
    Related Reading





  3. RSS Bot FEED
    #7893

Anandtech: Apacer Adds DDR4-3600 CL17 and DDR4-3466 CL16 Memory to Commando Series

The new Apacer Commando series DDR4 comes in 16 GB (2x8GB) dual-channel kits with frequencies reaching up to DDR4-3466 CL16 and DDR4-3600 CL17. The sticks were announced late last year but with looser (CL18) timings. The new SKUs run at slightly tighter timings with the same voltage and the same unique heat spreader from the Armory design, which looks like the rail and iron sights on some assault rifles. The kits do not include RGB LEDs, relying instead on the heat spreader’s design aesthetic to set them apart from the rest.
The Commando series from Apacer uses a gun theme on these sticks, with an AR body printed on the side of the heatsink. Outside of the gun impressed on it, the heat spreader is black, with the Apacer and Commando names printed on the side in white and red respectively. The top rail and iron sights extend just a bit above the memory’s black PCB and the rest of the heatsink, but this design shouldn’t get in the way of too many CPU coolers. As always, check the dimensions for proper fitment.
We are unsure exactly which ICs are under the hood; however, judging from the timings and speed, as well as Apacer’s propensity to use SK Hynix in the past, that is where I am placing my bet. As mentioned earlier, the new revision of the DDR4-3466 sticks arrives with tighter timings of 16-18-18-36, versus 18-18-18-42. The DDR4-3600 kit comes in at 17-19-19-39, with both kits supporting XMP 2.0 profiles and running at 1.35V.
The new sticks should be a good fit for the latest Intel platforms, which are typically able to support these speeds by simply enabling the XMP profile. AMD’s AM4 platform has been maturing nicely over the past several months on many fronts, including memory compatibility, but kits this fast may be overkill for that platform and may also require manual entry of speed and timings. Perhaps Zen+ will change that when it releases in the coming months.
Both kits and the rest of the DDR4 Commando series are available now, although we were unable to find the new kits in the US. The DDR4-3466 CL16 sticks are priced at 254€ (~$312+) and the DDR4-3600 kit at 379€ (~$465+). Compared to other DDR4-3600 CL17 kits these are priced on the high side, as many at Newegg are around the $250 mark, some of which include RGB LEDs and have tighter timings.
Apacer Commando Series
Speed                    Timings                      Voltage   Capacity
*DDR4-3466               16-18-18-36                  1.35 V    8GB / 16GB / 32GB
*DDR4-3600               17-19-19-35                  1.35 V
DDR4-2400 / DDR4-2666    16-16-16-36                  1.2 V     8GB / 16GB / 32GB
DDR4-2800                17-17-17-36                  1.2 V     8GB / 16GB / 32GB / 64GB
DDR4-3000                16-18-18-38 or 16-16-16-38   1.35 V
DDR4-3200                16-18-18-38                  1.35 V
DDR4-3466                18-18-18-42                  1.35 V    8GB / 16GB / 32GB
* - Denotes new SKUs
    Related Reading:





  4. RSS Bot FEED
    #7894

    Anandtech: EKWB Releases Full Cover GPU Water Block for Titan V

EKWB is releasing a water block for NVIDIA’s flagship GPU, the Volta-powered Titan V. The Titan V, released in December 2017, isn’t pegged as a gaming card, but EK clearly believes a market exists among those who would like additional cooling on their $3000 video card. To that end, it has created the EK-FC Titan V GPU block, which cools the GPU, HBM2 memory, and power delivery with water channels running directly over these critical areas. Keeping these areas cool, particularly the GPU core, can help the card maintain higher clock speeds/boost bins. EK says this allows the graphics card and its VRM to “remain stable under high overclocks and able to reach full boost clocks.”
The new EK-FC Titan V block is made of nickel-plated electrolytic copper, with the top available in two choices – acrylic (clear) for those who like to see the internals, or black POM acetal. The block features a central-inlet split-flow cooling engine, which is said to offer good hydraulic performance and to work with reversed water flow without affecting cooling performance. This allows a bit more flexibility in loop design, since either port on the block can be used as inlet or outlet.
EK also includes a single-slot I/O bracket to replace the original two-slot bracket, transforming the card into a single-slot solution that can make installing multiple graphics cards easier. In addition to the two Titan V block options, EK will also sell backplates for the card separately, in either black or nickel. The backplates help with aesthetics by hiding the back of the video card’s PCB, and also provide additional passive cooling to the GPU core and power delivery sections. Fittings (sold separately) use the standard G1/4” threading.
    The EK-FC Titan V water blocks and backplates are available for purchase through the EK Webshop and their Partner Reseller Network. Both backplates are available through pre-order now with the black model shipping Tuesday, February 27th, while the Nickel version will start shipping Monday, March 5th.
EKWB EK-FC Titan V GPU Block
Name                               MSRP (inc. VAT)
EK-FC Titan V                      129.95€ / $149.99
EK-FC Titan V - Acetal + Nickel    129.95€ / $149.99
EK-FC Titan V Backplate - Black    33.95€ / $38.99
EK-FC Titan V Backplate - Nickel   39.95€ / $46.99
    Related Reading:






  5. RSS Bot FEED
    #7895

    Anandtech: NVIDIA Releases 391.01 WHQL Game Ready Driver

    Ahead of next week’s game launches, NVIDIA today released driver version 391.01 WHQL, featuring Final Fantasy XV Windows Edition (3/6) and Warhammer: Vermintide 2 (3/8) as Game Ready headliners. The patch also brings support for World of Tanks 1.0, an upcoming major overhaul of the game engine, as well as performance optimizations for PlayerUnknown’s Battlegrounds (PUBG), with NVIDIA citing internal testing on Pascal graphics cards. Rounding things out are a fair amount of miscellaneous bug fixes.
For Final Fantasy XV’s impending arrival on PC, a 21GB playable demo (Steam, Origin, Microsoft Store) was released just today as a preview of the full game, including HDR10 support as well as high-resolution 4K textures that will be optional in the full release. Alongside support for ShadowPlay Highlights and Ansel, NVIDIA worked with Square Enix to support a number of GameWorks features in Final Fantasy XV, including Flow, HairWorks, Hybrid Frustum Traced Shadows, Turf Effects, and Voxel Ambient Occlusion (VXAO). While a standalone benchmark was already released earlier this month, it suffered, among other issues, from un-customizable graphical presets, some of which automatically enabled NVIDIA GameWorks settings. The demo and full release allow the usual settings customization available on PC, though some presets may still enable GameWorks by default.
    Note that in addition to the PC enhancements for Final Fantasy XV Windows Edition, different digital stores are including different pre-order bonuses.
    Meanwhile, alongside support for Vermintide 2 and World of Tanks 1.0, NVIDIA has focused on PUBG for optimizations, claiming 3 – 7% 1080p performance improvements for the GeForce GTX 1050 and above, presumably comparing to the previous driver release. At 4K, NVIDIA cites 6 – 7% 4K performance uplifts for GTX 1070 cards and above, and at 1440p 5 – 6% uplifts for GTX 1060 3GB (1152 cores) and above.
    On the side of bug fixes, NVIDIA has resolved the following issues with 391.01:

    • Dynamic reflections flicker in BeamNG
    • Flickering shadows occur in Call of Duty WWII
    • “NvfbcPluginWindow” prevents Windows shutdown
    • Cold booting results in black screen on multi-monitor configurations
    • Enabling Stereoscopic 3D (3DVision) increases system shutdown time
    • After rebooting the system, the NVIDIA Control Panel “Content type” setting (found under Display > Adjust desktop color) is reset to “Auto-selected”
    • OpenGL program may crash when trying to map a buffer object on GeForce GTX 980 and 1080 Ti
    • For notebooks, GeForce GTX 965M performance drop occurs

    The list of open issues has not been changed for 391.01. And touching quickly on SLI, NVIDIA has updated/added profiles for Agents of Mayhem and PixArk.
    Wrapping things up, 391.01 supports the latest GeForce Experience 3.13.0.85 beta released last Friday, which addresses GameStream connectivity under poor network conditions and crashes when launching ELEX through GameStream.
    The updated drivers are available through the GeForce Experience Drivers tab or online at the NVIDIA driver download page. More information on this update and further issues can be found in the 391.01 release notes.



  6. RSS Bot FEED
    #7896

    Anandtech: Hands-On With the Nokia 8 Sirocco: Reviving Nokia's 'Dream Phone' Concept

BARCELONA, ESP — Nokia on Sunday introduced a new top-of-the-range, styled ‘dream’ phone, the Nokia 8 Sirocco, which it billed as perhaps the most awaited addition to its latest lineup of smartphones. The newcomer is based on the Qualcomm Snapdragon 835 SoC, comes with a pOLED display, and features the brand’s latest camera, also found on the Nokia 7 Plus. A key selling point of the new device, besides its performance and feature set, is its IP67-rated stainless steel body, processed using a diamond-cut technique.


  7. RSS Bot FEED
    #7897

    Anandtech: Qualcomm Announces Snapdragon 700 Series Platform: Carving Out A Niche for

    As part of their Mobile World Congress 2018 presentation this morning, Qualcomm is ever so slightly taking the wraps off of a new tier of Snapdragon platform SoCs. Dubbed the Snapdragon 700 Mobile Platform Series, the new tier in the Snapdragon SoC family is meant to better bridge the gap between the existing 600 and 800 series, offering many of the latter’s premium features at a lower price point. However just how well it bridges that gap remains to be seen, as Qualcomm is not announcing any specific processor configurations today, just the existence of the new tier.
    Currently Qualcomm’s SoC stack is split between four lines: the non-Snapdragon entry-level 200 series, and then the Snapdragon-branded 400, 600, and 800 series, encompassing the mid-range to performance markets respectively. With the addition of the 700 series, Qualcomm is essentially further sub-dividing the Snapdragon family, carving out a sub-premium brand below the flagship 800 series, but above the current 600 series.
In its short press release, the company states that the goal of the new Snapdragon 700 series platforms is to offer the type of premium features found in the 800 series SoCs in cheaper parts for lower-priced devices. The usual price/volume logic aside, Qualcomm’s press release specifically notes the Chinese market as a focus, which makes quite a bit of sense given that market’s continued rapid growth and somewhat lower purchasing power than the western markets where flagship 800 series phones dominate. As for what those features will be, Qualcomm makes special mention of its AI feature suite – the Qualcomm AI Engine – though the underlying CPU/GPU/DSP components are already part of existing 600 series SoCs as well as the 800 series. Nonetheless, it’s clear that Qualcomm is looking to establish a tier of SoCs that are slower and cheaper, but still at or near feature parity with the 800 series.
    Unfortunately as Qualcomm isn’t announcing specific SoCs at this time, hard technical details are few. Of the handful of figures included in Qualcomm’s announcement, they compare it at multiple points to the Snapdragon 660, touting 2x the “AI performance” and 30% better power efficiency than the fastest member of Qualcomm’s current 600 series stack. The announcement also notes that the company will be using new architectures for the CPU, GPU, and ISP blocks, so for the new parts we’re expecting to see versions of Qualcomm’s Kryo 385 CPU, Adreno 600 GPU, and Spectra 200 ISP respectively. On which note we’re still waiting for the first Snapdragon 845 devices to ship, but based on our early impressions, the Adreno 600 series GPU architecture in particular has proven quite capable and could turn some heads in a cheaper SoC as well.
    With all of that said, while Qualcomm is pitching this as a new product offering, after chatting with our always awesome senior mobile editor Andrei Frumusanu, I suspect what we’re seeing here is not Qualcomm commissioning a fifth line of SoCs, but rather bifurcating the 600 series. Whereas the 800 series has in recent years consisted of just a single current-generation design – i.e. the newly launched Snapdragon 845 – the 600 series has typically offered 2 or 3 different chips, sometimes with widely differing specifications. Case in point, the current Snapdragon 630 is a Cortex-A53 part while the Snapdragon 660 includes a quartet of high-performance Kryo cores. Splitting these into more distinct mid-range and sub-premium tiers would likely help Qualcomm and its partners better differentiate the two and position the 700 series as a more powerful option without having its lower-performing sibling muddle things. So following today’s announcement – and especially the Snapdragon 660 comparisons – I wouldn’t be in any way surprised if the rumored Snapdragon 670 ends up being a 700 series part, while its 640 counterpart remains in the 600 series.
    Anyhow, while Qualcomm isn’t talking about a shipping date at this point, they are announcing that commercial sampling for the Snapdragon 700 series will kick off in the first half of this year. So like 2017’s Snapdragon 660 and 630, I would expect that retail devices containing the new SoCs will show up before the end of the year.




  8. RSS Bot FEED
    #7898

    Anandtech: Western Digital Demos SD Card with PCIe x1 Interface, 880 MB/s Read Speed

    BARCELONA, ESP — Western Digital demonstrated an experimental SD card featuring a PCIe Gen 3 x1 interface at Mobile World Congress. Meanwhile, the SD Card Association is calling upon the industry to adopt PCIe as a standard interface and to support the development of a complete SD PCIe standard.
Western Digital is demonstrating a system featuring an M.2-to-SD adapter with an SD card that offers 880 MB/s sequential reads and up to 430 MB/s sequential writes, according to the CrystalDiskMark benchmark. The drive uses the existing UHS-II/III pins to construct a PCIe 3.0 x1 interface to the system (via a mechanical adapter), probably with standard PCIe voltage through a converter. The company is not disclosing the type of memory or the controller that powers the SD PCIe card, but it is clear that we are dealing with a custom solution. Meanwhile, Western Digital claims that the implementation costs of a PCIe interface are not as high as one might expect, as a PCIe x1 PHY is not all that large.
Western Digital further notes that the SD card with a PCIe interface is not standard and will not hit the market any time soon, but it is showing off the concept anyhow, having seen interest from certain parties in this kind of removable storage solution. For example, makers of industrial or special-purpose PCs could benefit from the flexibility provided by removable industrial-grade SD cards with PCIe 3.0 x1 performance (i.e., faster than SATA, easier to install than eMMC/UFS/BGA SSDs).
    The company believes that once the SD standard moves to PCIe, its evolutionary path will be pretty straightforward: after PCIe 3.0/3.1 come PCIe 4.0 and PCIe 5.0 that increase performance per lane from 8 GT/s to 16 GT/s and 32 GT/s, respectively.
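The ~985 MB/s figure quoted for SD-over-PCIe, and the Gen 4/Gen 5 scaling, fall out of a simple calculation from the per-generation transfer rates and 128b/130b line coding:

```python
# Rough effective-throughput figures for one PCIe lane, per generation.
# Gen 3 and later use 128b/130b line coding; the raw GT/s rates come
# from the PCIe specifications.
def pcie_lane_mbps(gt_per_s, overhead=128 / 130):
    """Effective one-lane throughput in MB/s after line-code overhead."""
    return gt_per_s * 1e9 * overhead / 8 / 1e6

for gen, rate in [("3.0", 8), ("4.0", 16), ("5.0", 32)]:
    print(f"PCIe {gen} x1: ~{pcie_lane_mbps(rate):.0f} MB/s")
```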
    In the meantime, the SD Card Association is calling for interested parties to participate in the creation of a proper PCIe SD/microSD card standard. The organization outlines key features of the upcoming standard on a banner in its booth:

    • Existing form-factors.
    • PCIe interface, NVMe protocol.
    • Support of legacy SD interface for backward compatibility (since the existing SD/PCIe card uses UHS-II/III pins, it is logical to expect the upcoming standard to support appropriate signaling too).

The Secure Digital interface has evolved greatly in performance over its existence: from the 54 – 104 MB/s supported by UHS-I all the way to the 624 MB/s (full duplex) supported by UHS-III. PCIe 3.0/3.1 can increase performance further to around 985 MB/s, beating UHS-III.
One of the key problems that the SD Card Association and its members face is support for UHS-II/UHS-III in host devices. Smartphone makers are reluctant to support the latest UHS standards, and PC makers rarely incorporate fast card readers into their products, as the only devices that use UHS-II are higher-end DSLR cameras, so mainstream users barely use or need UHS-II slots. When the SD standard adopts PCIe, manufacturers of various special-purpose PCs/servers will benefit, but producers of consumer electronics may (again) be unwilling to incorporate new controllers into their products due to a lack of immediate benefits and power consumption concerns. Nonetheless, since there are applications that can benefit from fast SD/microSD cards, expect the standard to be developed over the next several years.
    Related Reading:





  9. RSS Bot FEED
    #7899

    Anandtech: MWC 2018: ASUS Press Event Live Blog (6:30pm UTC, 1:30pm ET)

We're here in Barcelona covering ASUS's press event - expect the new ZenFone 5!


  10. RSS Bot FEED
    #7900

    Anandtech: ASUS Launches the ZenFone 5: Now With Added Notch

BARCELONA, ESP – Today ASUS is launching a smartphone that is designed, according to the speaker at our pre-briefing, to make it look like the user is holding an iPhone X. The new ASUS ZenFone 5, part of the ZenFone 5 family, comes with a notch. Apparently this is what the company says its customers want: the ability to look as if you have an iPhone X, but with something else.
Aside from the notch, ASUS’ ZenFone 5 launch today actually comprises three different devices: the ZenFone 5, the ZenFone 5 Lite (called the ZenFone 5Q in most of the world), and the ZenFone 5Z. We were given a pre-brief of the first two devices, with the flagship ZenFone 5Z only being revealed at the press event at Mobile World Congress.
    The ZenFone 5

The ZenFone 5 Family
                          | ZenFone 5 Lite                     | ZenFone 5                     | ZenFone 5Z
    SoC                   | Snapdragon 430 or Snapdragon 630   | Snapdragon 636                | Snapdragon 845 (4x Kryo 385 @ 2.8 GHz + 4x Kryo 385 @ 1.77 GHz, Adreno 630)
    Display               | 6-inch 2160x1080 (18:9) LCD        | 6.2-inch 2246x1080 (19:9) LCD | 6.2-inch 2246x1080 (19:9) LCD
    Dimensions            | ? x ? x ? mm, 168.3 grams          | ? x ? x ? mm, 155 grams       | ? x ? x ? mm, 155 grams
    RAM                   | 3/4GB LPDDR4                       | 4/6GB LPDDR4                  | 4/6/8GB LPDDR4
    NAND                  | 32/64GB                            | 32/64GB                       | 32/64GB
    Battery               | 3300 mAh, non-replaceable (all models)
    Front Camera          | 20MP Sony IMX376, f/2.0, plus secondary 120° wide-angle module | 8MP, f/2.0 | 8MP, f/2.0
    Primary Rear Camera   | 16MP, f/2.2                        | 12MP 1.4µm Sony IMX363, f/1.8, 83° FoV / 24mm equivalent (ZenFone 5 and 5Z)
    Secondary Rear Camera | 120° wide-angle module (all models)
    SIM Size              | Dual NanoSIM + microSD             | NanoSIM + NanoSIM/microSD (ZenFone 5 and 5Z)
    Connectivity          | microUSB 2.0                       | USB Type-C (ZenFone 5 and 5Z)
    Launch OS             | Android 8.0 with ZenUI 5 (all models)
ASUS’ history with the ZenFone family has gone through highs and lows, from the ever-popular ZenFone 2 through to the mass of dozens of ZenFone 3 variants that were difficult to keep track of. For the ZenFone 5 generation ASUS is sticking to three models, and the standard ZF5 is set to sit in the middle of that stack. Despite its middle position, ASUS was keen to point out some of its high-profile features.
The display specifications involve a 6.2-inch panel, listed as a 19:9 aspect ratio, with a Full-HD+ resolution (2246x1080). ASUS quotes 500 nits of brightness, a 1500:1 contrast ratio, and support for 100% of the DCI-P3 color space. The display also comes with Glove Touch support, which can be enabled through the settings. ASUS has supposedly implemented AI here, as part of its AI strategy, such that the system can use an onboard RGB sensor and apply screen color/brightness adjustments in reaction to ambient lighting. We were told that this goes beyond regular brightness adjustment and dimming in bright light, but when pressed, we were told that this ‘artificial intelligence’ feature was not based on any form of machine learning. More on this later.
Aside from the display, ASUS wanted to highlight its camera solution. The handset has a dual rear camera setup, with the second camera sitting behind a wide-angle lens. The main rear camera is a 12MP Sony IMX363 sensor with an f/1.8 aperture, 1.4 micron pixels, and OIS/EIS. The secondary rear camera is just listed as an 8MP unit with an f/2.2 aperture and a 120-degree field of view. Underneath is a fingerprint sensor.
    The rear cameras are combined with machine learning software, which will detect up to 16 different scenes and objects, including people, food, snow, stage, flowers, text, and others. The system will apply a series of color saturation, white balance, exposure, and sharpness/noise adjustments to the camera settings to provide a better shot.
    On top of helping with taking shots, this AI is going to be combined with another AI related to photo learning. For post processing, the system will offer a number of suggested edits to the picture, and ask the user if they want to accept the changes. Over time the idea is that the system will understand the preferred settings of each user for each different type of scene and use them to help manage the pre-shot AI.
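As a sketch of how such a scene-driven pipeline might hang together (the scene labels, setting names, and adjustment values below are hypothetical illustrations, not ASUS's actual tuning tables):

```python
# Hypothetical sketch of the scene -> camera-settings flow described above.
# Base presets come from the scene classifier; accepted post-shot edits
# feed back in as per-user, per-scene overrides.
SCENE_PRESETS = {
    "food": {"saturation": 15, "white_balance_k": 4800, "sharpness": 5},
    "snow": {"saturation": 0,  "white_balance_k": 6500, "exposure_ev": 0.7},
    "text": {"saturation": -10, "white_balance_k": 5000, "sharpness": 20},
}

def adjust_for_scene(label, user_prefs=None):
    """Merge the detected scene's preset with learned per-user tweaks."""
    settings = dict(SCENE_PRESETS.get(label, {}))
    settings.update((user_prefs or {}).get(label, {}))
    return settings

# An edit the user accepted (stronger saturation on food shots) overrides
# the stock preset next time the same scene is detected:
prefs = {"food": {"saturation": 25}}
print(adjust_for_scene("food", prefs))
```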
The front-facing camera is an 8MP unit as well, but with an f/2.0 aperture. This camera is used for ASUS’ Face Unlock feature, which was explained as being important for environments where gloves are worn a lot (our presenter was based in the Nordic countries).
    On the SoC side, the ZF5 is using a Qualcomm Snapdragon 636 processor, which has eight Kryo 260 cores running at 1.8 GHz, built on Samsung’s 14LPP process, and with Adreno 509 graphics. This SoC will be paired with up to 6 GB of LPDDR4X, and up to 128 GB of storage.
ASUS was keen to point out the audio functionality, with dual NXP9874 smart amps in place – one for each speaker. Heard in person, the improvement over the previous ZenFone 4, and some other flagship devices, was substantial for speakers in a regular smartphone position. ASUS also touts the ZenFone 5 as the world’s first smartphone with DTS Headphone X support, providing 7.1 virtual surround sound. These features tie into ASUS’ AudioWizard tool, which offers different listening profiles depending on which headphones are used – an external audio tuning company has created profiles for over 400 types of commercial headphones, and when one is enabled on the ZF5 the frequency response is adjusted accordingly. There is also a personal 60-second test to help the system adjust.
Another feature is ‘AI Charging’: the idea is that most users charge their phone every night at a similar time, so the system will learn the usual pattern and adjust the charging profile to match. The key here is that batteries last longest when they are held mostly charged rather than fully charged. So rather than spending the first two hours of the night charging and then sitting at 100% for the next six hours, the system will hold the battery level at 80% and then push to 100% near to when you are expected to rise. This is meant to be implemented through machine learning, and can take up to three weeks to learn. For users with a regular schedule this might work well – although I can imagine the frustration of having to wake up early one morning and finding the phone only 80% charged.
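The described behavior can be sketched as a simple scheduling rule. The one-hour top-off window and the 80% hold level here are assumptions for illustration; ASUS has not published its actual algorithm:

```python
# Illustrative sketch of the "hold at 80%, finish before wake" idea.
# The thresholds and the top-off window are assumed values, not ASUS's.
from datetime import datetime, timedelta

def charge_target(now, predicted_wake, top_off_window=timedelta(hours=1)):
    """Return the charge ceiling (%) to aim for at this moment."""
    if predicted_wake - now <= top_off_window:
        return 100  # close to wake-up: push to full
    return 80       # overnight: hold at 80% to reduce battery wear

wake = datetime(2018, 3, 1, 7, 0)
print(charge_target(datetime(2018, 3, 1, 1, 0), wake))   # mid-night: hold
print(charge_target(datetime(2018, 3, 1, 6, 30), wake))  # near wake: top off
```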
ASUS has also included a feature called ‘AI Boost’, which has nothing to do with machine learning. From what I was able to determine from the marketing team at the pre-brief, this feature locks all the cores at their peak frequency, which has the potential to boost performance. ASUS decided that showing Antutu benchmark results was the best strategy here, with a +50% performance jump; we performed a quick test with the Jetstream benchmark, which showed an 18% performance gain with the mode on. The fact that this tool has nothing to do with the definition of artificial intelligence or machine learning is indicative of the new strategy of ASUS’ smartphone marketing team: calling everything AI. It reminds me of a very specific comic strip about how people misuse the term AI.
    Prices and availability are set to be announced at the ASUS Launch Event at Mobile World Congress.




