Thread: Anandtech News

  1. RSS Bot FEED (#6571)

    Anandtech: GIGABYTE's New Console: The 'Gaming GT' PC Launched with Core i7-K, GTX 1080

    GIGABYTE has quietly announced a new small form-factor computer designed for performance-minded gamers. The SFF PC not only packs high-performance parts, such as a latest-generation CPU and GPU, but uses desktop-class components throughout, which could make it upgradeable.
    High-performance PCs in small form factors are not rare nowadays: various PC makers use mobile CPUs and GPUs to build such systems. These computers deliver high performance out of the box, but upgrading them is tricky because MXM GPUs and mobile CPUs are hard to find at retail. When developing the Gaming GT desktop (GB-GZ1DTi7-1080-OK-GW; technically the PC belongs to the Brix family), GIGABYTE's engineers wanted to create a product that uses widely available processors and graphics cards and thus potentially offers owners an upgrade path.
    The GIGABYTE Gaming GT desktop is not really a miniature system: it measures 276×384×128 mm, roughly 13.6 liters by bounding-box volume, which makes it noticeably larger than Sony's first-generation PlayStation 3. The PC is based on a custom motherboard (210×205 mm) that is a bit larger than Mini-ITX, but still smaller than microATX or FlexATX. The system uses a dual-chamber design (the CPU, DRAM, and SSD sit on one side of the PC; the graphics card and additional 2.5" storage devices on the other), but the chambers are not completely isolated because they share the airflow generated by two system fans (GIGABYTE has not disclosed their size, but they may be 90 mm in diameter). It is noteworthy that the PC has an automated exhaust system that opens flaps at the top of the chassis when the components need extra cooling.
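    For the curious, the ~13.6 L figure is simply the product of the quoted dimensions; here is a quick sanity check in Python (the PlayStation 3 dimensions are the commonly cited launch-model figures, included only for comparison):

        # Bounding-box volume of the Gaming GT from its quoted dimensions.
        # Note this is the external box; the usable internal volume is smaller.
        width_mm, depth_mm, height_mm = 276, 384, 128

        gaming_gt_liters = width_mm * depth_mm * height_mm / 1_000_000  # 1 L = 10^6 mm^3
        print(f"Gaming GT: {gaming_gt_liters:.1f} L")  # -> 13.6 L

        # Launch-model PlayStation 3, approx. 325 x 274 x 98 mm, for scale:
        ps3_liters = 325 * 274 * 98 / 1_000_000
        print(f"PS3 (2006): {ps3_liters:.1f} L")  # -> 8.7 L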
    The chassis can accommodate any double-wide graphics card up to 280 mm long and 41 mm thick, though nothing too exotic in terms of heatpipes and backplates, because space is constrained. GIGABYTE will ship the Gaming GT desktop with its own GeForce GTX 1080 G1 Gaming 8 GB card, but the key point, the company says, is that the card can be upgraded later. Since the card faces downwards, it has to use custom (bundled) cables to connect to displays. Audio and Ethernet cables also have to be angled, but at least such cables are not hard to find at retail. On the bright side, the Gaming GT desktop uses a 400 W FlexATX PSU, which can likewise be replaced if needed.
    GIGABYTE Gaming GT Specifications (GB-GZ1DTi7-1080-OK-GW)
    CPU: Intel Core i7-6700K (quad-core with HT, 4.0 GHz/4.2 GHz, 91 W)
    PCH: Intel Z170
    Graphics: GIGABYTE GeForce GTX 1080 G1 Gaming, 8 GB GDDR5X (2560 stream processors, 160 texture units, 64 ROPs)
    Memory: 32 GB DDR4 (2×16 GB)
    Storage: 240 GB SSD (PCIe/SATA?); 1 TB 2.5" HDD (7200 RPM); 1×2.5" bay for an extra HDD/SSD (SATA)
    Wi-Fi: Intel Dual Band Wireless-AC 3165NGW, 1×1 802.11ac + BT
    Ethernet: Rivet Networks Killer E2400 Gigabit LAN
    Display Outputs: 1 × DVI-D DL; 1 × HDMI 2.0b; 1 × HDMI 1.4 (uses iGPU); 3 × DisplayPort 1.4
    Audio: 5.1-channel audio; Realtek ALC1150 codec; TI Burr-Brown OPA2134 operational amplifier
    USB: 5 × USB 3.0 Type-A (5 Gbps); 1 × USB 3.1 Type-A (10 Gbps); 1 × Thunderbolt 3/USB 3.1 Type-C (10 Gbps)
    Other I/O: -
    Dimensions: 276 mm × 384 mm × 128 mm (10.86 × 15.11 × 5.04 inches)
    PSU: FlexATX 400 W
    OS: Windows 10 Home
    The custom motherboard of the GIGABYTE Gaming GT PC is based on the Intel Z170 PCH and thus supports all LGA1151 processors, including the upcoming Kaby Lake chips. The manufacturer will ship the system with the Intel Core i7-6700K CPU (so overclocking seems to be possible, though keep temperatures and noise in mind), and eventually the chip could be swapped for something more powerful.


    The GIGABYTE Gaming GT PC will come with 32 GB of dual-channel DDR4 memory, a 240 GB SSD (M.2 form factor, but no word on performance), a 1 TB HDD with a 7200 RPM spindle speed, and an additional 2.5" bay for an extra drive. For those not satisfied with an M.2 SSD and two 2.5" HDDs/SSDs, GIGABYTE even installed a Thunderbolt 3 port for connecting external high-performance storage or special-purpose hardware. As for other I/O, the Gaming GT desktop offers a dual-band 1×1 802.11ac + BT 4.2 wireless module, Gigabit Ethernet (Killer E2400), five USB 3.0 Type-A ports, one USB 3.1 Type-A (10 Gbps) port, 5.1-channel audio (the Realtek ALC1150 with a TI Burr-Brown OPA2134 amplifier), and so on.
    Finally, to give its Gaming GT system a distinctive look, GIGABYTE installed a series of RGB LEDs on top of it. The LEDs can work in different modes and can be controlled using the company’s Ambient LED application.
    Pricing and availability dates for the GIGABYTE Gaming GT SFF PC were not available at press time. Keep in mind that the actual configuration of the PC will differ between regions, which means prices will vary as well. Chances are we'll see it at CES next week.
    Gallery: GIGABYTE Announces Brix Gaming GT SFF PC: Core i7-6700K, GeForce GTX 1080, TB3






    More...

  2. RSS Bot FEED (#6572)

    Anandtech: Lenovo Launches Legion Branded Gaming Laptops

    While the PC market has been contracting somewhat in recent years, there is still one segment where buyers need ever more performance: gaming. Even at just 1920x1080, a GPU as recent as last year's GTX 980M would not always run games at that resolution at a solid 60 frames per second.
    With the launch of NVIDIA's Pascal for notebooks, that has changed, and Lenovo is announcing today a new sub-brand dedicated to gaming. Previously, its gaming laptops sat under the Ideapad lineup alongside most of its consumer machines, but with the new Legion branding, Lenovo is looking to move further into this lucrative market. With a true gaming brand, it wants the same community engagement that other companies have created around their own gaming brands, and it's a smart idea.
    Lenovo Legion Y720
    To launch the Legion brand, Lenovo is announcing two new laptops, the Lenovo Legion Y520 and Y720. This is certainly not Lenovo's first foray into gaming laptops, and the new models are a spiritual successor to machines like the IdeaPad Y700 that we reviewed last year. The Y700 had a good design but a sub-par display and a cramped keyboard, and it's interesting to see how quickly Lenovo has at least addressed the keyboard; the display may be better as well, but that will have to wait for proper testing.
    Lenovo Legion Y720
    Lenovo has taken a very interesting approach to fitting a full keyboard plus number pad into the new Legion laptops. Where most laptops, including last year's Y700, end up compressing the keys into too small a space, Lenovo has shifted the number pad up, creating a nice spot for dedicated full-sized arrow keys. The Zero key is still half-size, but that's not uncommon on a notebook. It's a smart design, and on the Y520 it continues the red backlighting previously established on the Y lineup; Lenovo has also added some extra lighting around the trackpad for a nice touch. The Y720 offers an optional multi-color zoned keyboard as well. The keys feature 1.7 mm of travel.
    Both the Y520 and Y720 are 15.6-inch models, and the higher model number reflects the higher-performance parts in the new shell. Both models feature 7th generation (Kaby Lake) Intel Core i7 processors, with the Y720 offering up to the quad-core i7-7700HQ. Both offer two SODIMM slots, with up to 16 GB of RAM available from Lenovo, and more if you want to buy your own DDR4. They also both feature PCIe SSD boot storage, as well as mechanical HDD options for bulk storage, and I would imagine the lowest-end models may forgo the SSD to save cost.
    The big difference is really the GPU, with the Y520 offering up to a GeForce GTX 1050 Ti. The 1050 Ti has not been formally announced for notebooks as of this writing, but expect it to fall in line with the other Pascal notebook parts: a bit slower than its desktop counterpart, but close enough that NVIDIA has stopped branding the two separately. The Y720 offers the more powerful GTX 1060.
    Lenovo Legion Y520
    On the display side, the lower-cost Y520 offers just a 1920x1080 panel, while the Y720 offers a choice of that or a 3840x2160 panel. The GTX 1060 is not going to be enough GPU for gaming at that resolution, especially at maximum settings, but it should be nice for less demanding games and office work.
    As for audio, both have 2x2W speakers, but the Y720 adds a 3W subwoofer and is the first ever Dolby Atmos PC according to Lenovo.
    On the battery side, the Y520 has just a 45 Wh battery, which is very small for a gaming notebook; the Y720 bumps that up to 60 Wh. With the extra battery and performance, the Y720 tips the scales at 7.05 lbs, up from the Y520's 5.3 lbs. Both models are fairly thin, the Y720 at 1.14 inches and the Y520 at 1.01 inches, which should help with portability. The Y720 will be available in April at a starting price of $1,400, and the Y520 will be offered in February starting at just $900, a pretty low entry point for a gaming notebook.
    The new Legion brand appears to be off to a good start with these new models, and we hope to get some hands-on time with them this week.
    Gallery: Lenovo Legion


    Source: Lenovo


    More...

  3. RSS Bot FEED (#6573)

    Anandtech: Lenovo Announces The Alexa-Powered Smart Assistant And Smart Storage NAS

    When Amazon announced the Echo, it's likely even they were not prepared for the response to the device. The Echo is the primary funnel to Alexa, Amazon's intelligent personal assistant, which has been hugely successful as the Echo's voice-based assistant. Today Lenovo is announcing that it has partnered with Amazon to bring Alexa to the Lenovo Smart Assistant.
    Lenovo's take on the voice-based IoT assistant is available in three colors and offers eight 360° far-field microphones with noise suppression and acoustic echo cancellation, all of which allows it to be used from up to 16 feet (5 meters) away.
    Lenovo will also be offering a Harman Kardon edition of the Smart Assistant, which will offer better quality speakers for an improved audio experience when using the assistant to playback music.
    Some may wonder why Lenovo would want to create such a device, but it will be able to control Lenovo smart home devices, giving the company a foot in the door to this arguably large IoT market. It will also work with many existing third-party products.
    The design is typical for this type of device: a large cylinder meant to stand upright. A small footprint is ideal, since these devices will almost certainly end up on a counter in a home. Lenovo's multiple color options, as well as the Harman Kardon audio version, bring a bit more customization than you might see otherwise. The Lenovo Smart Assistant will be available in May starting at $130, and the Harman Kardon model will be priced at $180.
    Lenovo is also announcing the Smart Storage solution, a NAS device meant to sync data between multiple devices in the home. There's no doubt that our digital lives create an enormous amount of data, and the Smart Storage NAS will be available with up to 6 TB of space.
    The design is certainly interesting, with an upright chassis that is much more appealing to look at than most NAS units on the market, and the Smart Storage's smarts include the ability to use facial recognition to organize your photo library. It features dual-band wireless access, as well as Ethernet and USB 3.0 ports. Lenovo's press material doesn't delve too deeply into the Smart Storage's other software features, so we'll have to wait for some hands-on time at CES this week.
    Finally, Lenovo is also offering an interesting new take on the HTPC keyboard. One of the biggest issues with an HTPC keyboard is that you want it to be wireless, portable, and easy to use, yet it is often handy to have a trackpad or pointer control too. Lenovo's solution is the Lenovo 500 Multimedia Controller, which offers a full keyboard experience as well as a full trackpad.
    Where is the trackpad, you might ask? The entire keyboard is the trackpad. This provides a huge surface for pointer control, as well as Windows 10 gesture support, without the extra space a dedicated trackpad would require. The trackpad defaults to 150 DPI but can be set as high as 1000 DPI if needed. The keyboard connects over a 2.4 GHz USB dongle and is powered by two AAA batteries that Lenovo says will last up to 8 months. It will be available in March for $54.
    Source: Lenovo



    More...

  4. RSS Bot FEED (#6574)

    Anandtech: AMD Announces FreeSync 2: Easier & Lower Latency HDR Gaming

    Though they don't get quite as much ongoing attention as video cards due to their slower update cadence, one of the nicer innovations in the gaming hardware ecosystem over the last few years has been variable refresh displays. By taking displays off of a fixed refresh rate and instead coupling the refresh rate to the frame rate, the state of gaming on the PC has become a lot more pleasant, especially in the irksome range between 30 and 60 frames per second.
    As it was NVIDIA that made the first move here in 2013, AMD only ended up rolling out its own variable refresh solution in 2015. Under the brand name FreeSync, AMD leveraged VESA's optional DisplayPort Adaptive-Sync standard to offer variable refresh in conjunction with the major monitor manufacturers. The fact that AMD was second to market didn't dampen their enthusiasm (or customers') too much, but it did mean that until recently they were playing catch-up with NVIDIA on extra features. AMD finally reached (practical) feature parity with NVIDIA just last month when they added support for borderless windowed mode.
    But now that AMD has caught up with NVIDIA, their attention is quickly shifting to what they need to do to get ahead and where they can go next. This is a harder area to tackle than may at first be apparent; variable refresh is a fundamental feature, and once you have support for it, it shouldn't require constant fiddling. The end result is that for their next monitor technology initiative, AMD is tackling more than just refresh rates. Looking to address the high-end market with a new solution for both HDR and variable refresh, today AMD is taking the wraps off of this initiative: FreeSync 2.
    FreeSync 2: HDR Done Better

    Trying to explain FreeSync 2 can get a bit tricky. Unlike the original FreeSync that it takes its name from, it's a multi-faceted technology: it's not just variable refresh, but HDR as well. It's also a business/platform play in a different way than FreeSync was. And while strictly speaking it's a superset of FreeSync, it is not meant to replace FreeSync wholesale. Perhaps the best way to think of FreeSync 2 is as a second, parallel initiative focused on what AMD, its monitor partners, and its game development partners can do to improve the state of high-end monitors and gaming.
    In terms of features then, what is easily the cornerstone feature of FreeSync 2 – and really its reason to be – is improving support for HDR gaming under Windows. As our own Brandon Chester has discussed more than once, the state of support for next-generation display technologies under Windows is mixed at best. HiDPI doesn't work quite as well as anyone would like it to, and there isn't a comprehensive and consistent color management solution to support monitors that offer HDR and/or color spaces wider than sRGB. The Windows 10 Anniversary Update has improved on the latter, but AMD is still not satisfied with the status quo on Windows 10 (never mind all the gamers still on Windows 7/8).
    As a result FreeSync 2 is, in part, their effort to upend the whole system and do better. For all of its strengths as a platform, this is an area where the PC is dragging compared to consoles – the PlayStation 4 was able to add functional & easy to use HDR10 support to all units as a simple software update – so for AMD they see an opportunity to improve the situation, not only making HDR support more readily available, but improving the entire experience for gamers. And to do this, AMD’s plans touch everything from the game engine to the monitor, to make HDR the experience it should be for the PC.
    Diving into the technical details then, AMD's solution is essentially a classic one: throw out what isn't working and build something that works better. And what isn't working right now? As mentioned before, Windows doesn't have a good internal HDR display pipeline, making it hard to use HDR with Windows at all. Meanwhile HDR monitors, though in their infancy, have their own drawbacks, particularly when it comes to input lag. The processors used in these monitors aren't always capable of low-latency tone mapping to the monitor's native color space, so using their HDR modes can add a whole lot of input lag. Worse still, current HDR transports (e.g. HDR10) require tone mapping twice – once from the application to the transport, and a second time from the transport to the native color space – so even if a monitor has a fast processor, there's still an extra (and, AMD argues, unnecessary) step adding input lag.
    FreeSync 2 then attempts to solve this problem by upending the whole display pipeline, getting Windows out of the way and offloading as much work from the monitor as possible. FreeSync 2, in this respect, is essentially an AMD-optimized display pipeline for HDR and wide color gamuts, designed to make HDR easier to use and better performing.
    The FreeSync 2 display pipeline as a result is much shorter (i.e. lower latency), and much more in AMD’s control. Rather than the current two-step process, AMD proposes to have a single step process: games tone map directly to the native color space of a FreeSync 2 compliant monitor, AMD’s drivers and hardware pass that along, and then the monitor directly accepts the display stream without further intensive processing. The end result is that latency is potentially significantly reduced by removing the second tone mapping step from the process.
    Meanwhile on the usability side, AMD's drivers and FreeSync 2 monitors would implement a form of automatic mode switching. The idea here is that Windows in its current form really doesn't like anything other than sRGB, so for desktop use, users are better off with their monitor in that mode. However, when a FreeSync 2-compatible game is fired up, the monitor and AMD's drivers would switch over to the native color space automatically, and back again when returning to the Windows desktop. The ultimate idea is to make it easy to use HDR and wide color gamuts when feasible, and sRGB when not.
    Overall, this sounds like a reasonable solution to making HDR work in the short-term. AMD can’t fix Windows’ handling of HDR or wide color gamuts – you still don’t have a truly color managed environment on the Windows desktop for windowed applications – but it would be an improvement over the current situation by letting games and other applications call for something better than sRGB when they’re being used in fullscreen exclusive mode.
    However to make all of this work, AMD will need to bring together both display manufacturers and game developers, and this is likely to be the trickiest part of AMD’s plan for FreeSync 2. Under the hood, AMD makes this shortened display pipeline work by having games tone map directly to a monitor’s native space, but to do so games need to know what the specific capabilities are of the attached monitor; what color space it can render to, and over what brightness range. This isn’t something Windows’ APIs currently support, and that means AMD has to provide a FreeSync 2 API instead. And that means AMD needs to get developers on-board.
    The good news for AMD (and developers) is that the actual implementation of FreeSync 2 should be quite simple, since most games already render in HDR and tone map to at least SDR to begin with. Game developers only need to query the API, tone map to the specifications AMD provides, and from there it's AMD's and the monitor's problem. But counting on developers to do anything extra for PC games is always a risk, one that has hurt initiatives in the past. For their part, AMD will be doing what they can: focusing on the upstream engines and on developer relations/evangelism. By getting FreeSync 2 support added to major engines like Unreal Engine and Unity, AMD makes it much easier for downstream developers to adopt FreeSync 2. Beyond that, it's about convincing developers that supporting FreeSync 2 will be worth their while, both in terms of sales and of improving the customer experience.
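    AMD has not published the FreeSync 2 API itself, so the following is only an illustrative sketch of the flow described above, with every function and field name invented for the purpose: query the panel's native capabilities once, then tone map each frame directly to that target so the monitor can skip its own tone-mapping pass.

        # Hypothetical sketch of the single-pass FreeSync 2 tone-mapping flow.
        # All names here are invented for illustration; AMD's actual API was
        # not public at the time of writing.
        from dataclasses import dataclass

        @dataclass
        class DisplayCaps:
            min_nits: float     # lowest brightness the panel can display
            max_nits: float     # peak brightness
            color_space: str    # the panel's native gamut, e.g. "DCI-P3"

        def query_display_caps() -> DisplayCaps:
            """Stand-in for the driver query a FreeSync 2 game would make."""
            return DisplayCaps(min_nits=0.05, max_nits=1000.0, color_space="DCI-P3")

        def tone_map_to_native(hdr_frame, caps: DisplayCaps):
            """Map the engine's HDR output straight to the panel's native range.
            Because this single pass targets the monitor's real capabilities,
            the monitor's own tone-mapping pass (and its latency) is skipped."""
            ...

        caps = query_display_caps()  # once, at startup or on display change
        # Per frame: frame = render_scene(); tone_map_to_native(frame, caps); present(frame)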
    On the flip side of the coin, getting monitor manufacturers on-board should be relatively easy. AMD's original FreeSync effort was extremely successful here (to the tune of 121 FreeSync monitors), in part because AMD made it such an easy feature to add, and they are aiming for something similar with FreeSync 2. It doesn't sound like display controllers need to be substantially altered to support FreeSync 2 – they just need to have a tone mapping bypass mode and understand requests to switch modes – which would make it easy for the monitor manufacturers to add support. And for their part, the monitor manufacturers like features such as FreeSync because they can be easily implemented as value-add features that allow a monitor to be sold at a higher price.
    On a final note, while the FreeSync 2 initiative as-planned requires game developers to buy into the ecosystem by supporting the related API, I did take a moment to ask AMD about whether they could do anything to better support games that might offer HDR support but not use AMD’s API. The answer, unsurprisingly, was “no comment”, but I got the distinct impression that it’s a question AMD has considered before. Without direct API support there’s still a need to do tone-mapping twice, and that would negate some of the latency benefits, but AMD could still potentially do it a lot faster than the display processors in some monitors. If AMD were to struggle with developer adoption, then that alone could still make FreeSync 2 worth it.
    FreeSync 2: Tighter Standards for Variable Refresh

    Earlier I mentioned that FreeSync 2 is really a collection of several ideas/features, and while HDR is certainly the marquee feature of FreeSync 2, it's not the only one. With FreeSync 2, AMD will also be tightening the standards for the variable refresh functionality that approved monitors need to support.
    The open nature of FreeSync has led to a large number of monitors that support the technology across a wide range of prices, but it has also led to wide variety in how useful their FreeSync implementations are. A number of basic monitors on the market only support a range of 30Hz to 60Hz, for example. And while this is still useful, such a narrow range means that these monitors don't deliver a very good experience below their minimum refresh rate. These monitors can't support FreeSync's Low Framerate Compensation (LFC) technology, which requires the maximum refresh rate to be at least 2.5x the minimum refresh rate (or 75Hz for our 30Hz example monitor).
    As a result, AMD has tightened the standards for FreeSync 2. All FreeSync 2 certified monitors will be required to support LFC, which in turn means they’ll need to support a wide enough range of refresh rates to meet the technology’s requirements. Consequently, anyone who buys a FreeSync 2 monitor will be guaranteed to get the best variable refresh experience on an AMD setup, as opposed to the less consistent presence of LFC on today’s FreeSync monitors.
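    The 2.5x rule is easy to check for any given panel. Here is a small sketch of the requirement quoted above (the example refresh ranges are arbitrary):

        # AMD's Low Framerate Compensation (LFC) needs the maximum refresh
        # rate to be at least 2.5x the minimum refresh rate.
        def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.5) -> bool:
            return max_hz >= ratio * min_hz

        print(supports_lfc(30, 60))    # False: a 30-60 Hz panel falls short (needs 75 Hz)
        print(supports_lfc(48, 144))   # True: 48 x 2.5 = 120 <= 144

        # Below the minimum, LFC multiplies frames to stay in range: e.g. on a
        # 30-144 Hz monitor, 20 fps content is presented at 40 Hz, with each
        # frame shown twice.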
    Similar to this and AMD’s HDR efforts with FreeSync 2, AMD will also be mandating a general low latency requirement for the new standard. It’s not fully clear just what this will entail, but at a high-level AMD is going to require that monitors be low latency in SDR mode as well as HDR.
    FreeSync 2: A Focus on the High-End

    The final shift in FreeSync 2 – and really what makes it a parallel effort as opposed to a replacement for FreeSync 1 – is how AMD will be approaching the market. The cost of meeting the HDR and variable refresh requirements for FreeSync 2 means that this is very much a play at the high-end monitor market. Budget monitors won't be able to meet these requirements (at least not right away), so AMD's focus is going to be on the high end.
    The significance, besides the parallel standards, is that this will impact how AMD goes about certifying monitors, and potentially how "free" FreeSync 2 ends up being. The additional requirements mean that AMD will need to run a more complex certification program: they will need to bring in monitors to profile their native color spaces and confirm they meet the latency and refresh requirements. All of this costs AMD time and money.
    As a result, when questioned on the matter, AMD is not currently commenting on the subject of FreeSync 2 royalties. Presumably, AMD is pondering the idea of charging royalties on FreeSync 2 hardware.
    The subject of royalties on gaming hardware is not a very happy subject, nor is it one that too many companies like to talk about. NVIDIA for their part does charge manufacturers a form of royalties on their G-Sync technology – this being part of the impetus for AMD calling their variable refresh implementation FreeSync – and while no one will go on record to confirm the numbers, what rumblings I’ve heard is that G-Sync is “not cheap.” But numbers aside, at the end of the day this makes variable refresh a value add feature for NVIDIA just as much as it does their monitor manufacturer partners, as they profit from the sale of G-Sync monitors. At the same time it also means that the ongoing development of G-Sync is self-sustaining, as the program can now be funded from G-Sync royalties.
    There are a number of assumptions in here, but ultimately the fact that AMD isn’t immediately rejecting the idea of royalties could prove to be a very important one. Royalties at a minimum would help fund the certification program, and taken to the same extent as NVIDIA could become another revenue stream entirely. And since FreeSync 2 is aimed at high-end monitors, it would allow AMD to claim a piece of the pie on their own value add feature, as high-end monitors can fetch a significant profit of their own. Negatively however, it would also likely push FreeSync 2 monitor prices up, making them less affordable.
    At any rate, while AMD is pondering royalties on FreeSync 2, they won’t be giving up on the free-as-in-speech aspects of FreeSync 2. AMD tells us they will still be pushing for technological openness so that everyone can see how FreeSync 2 works, even if ultimately AMD decides to charge monitor manufacturers to make it work with their video cards. Ultimately, where exactly we’ll end up remains to be seen, as AMD is very much still in the early stages of planning with FreeSync 2.
    Hardware Compatibility & First Thoughts

    Wrapping things up, now that we've covered the proposed feature set of FreeSync 2, let's talk about hardware compatibility. AMD has repeatedly touted the flexibility of their more recent display controllers, and this is once again going to be the case with FreeSync 2. Because all of AMD's FreeSync 1-capable cards (i.e. GCN 1.1 and later) already support both HDR and variable refresh, every GPU that supports FreeSync 1 will also be able to support FreeSync 2. All it will take is a driver update.
    Admittedly I don’t see too many Radeon HD 7790 or R9 290X owners shelling out for what will likely be an expensive generation of HDR monitors, but it’s nonetheless neat to see that AMD will be able to bring that tech to older cards. More practically speaking, this means that recent buyers of the RX 480 and other Polaris cards won’t be left out in the cold once FreeSync 2 arrives.
    And when will FreeSync 2 arrive? The answer to that is a bit less clear. AMD is not setting any hard dates and is not announcing any monitors today. They are only announcing the start of the FreeSync 2 initiative. They still need to finish writing the necessary driver code and bring on both hardware and software partners.
    The nature of their presentation makes it sound like FreeSync 2 is something that should arrive this year. Certainly the timing is right given the impending launch of HDR-capable PC monitors. But as FreeSync 2 relies on a number of external factors, I suspect AMD wants to avoid making promises they can’t deliver on alone.
    In the meantime AMD’s initiative will definitely bear keeping an eye on. AMD is pushing the right buttons with their plan to improve the state of HDR gaming on the PC. If they and their partners can deliver on what they propose, then it would mean that HDR gaming on the PC would shine far more brightly than it otherwise would.
    Gallery: AMD FreeSync 2 Presentation




    More...

  5. RSS Bot FEED (#6575)

    Anandtech: Intel Launches 7th Generation Kaby Lake: 15W/28W with Iris, 35-91W Desktop

    The death of Intel’s ‘Tick-Tock’ means that Kaby Lake is Intel’s third crack at their 14nm process. 14nm started with Broadwell (5th Gen, tick), introduced a new microarchitecture with Skylake (6th Gen, tock), and now is in the ‘optimization’ stage with Kaby Lake (7th Gen). This means an improved ‘14nm Plus’, offering better power efficiency and higher frequencies through a less strained transistor floorplan. Intel is launching a myriad of SKUs under Kaby Lake, ranging from mobile KBL-U at 15W and 28W through mobile KBL-H at 45W and desktop-class KBL-S at 35W to 91W. This includes three overclocking SKUs for desktop, including an i3 variant. Here’s the front page of AnandTech’s Kaby Lake launch coverage.

    More...

  6. RSS Bot FEED (#6576)

    Anandtech: The Intel Core i7-7700K (91W) Review: The New Out-of-the-box Performance Champion

    The Core i7-7700K, launched today, is Intel's fastest ever consumer-grade processor. With Intel's third set of processors at 14nm, built on the new 14+ variant of the process, we get chips with a better frequency-voltage curve, which translates into more performance, better efficiency, and the potential to push the silicon further and harder. Here is our review.


    More...

  7. RSS Bot FEED (#6577)

    Anandtech: Qualcomm CES 2017 Liveblog


  8. RSS Bot FEED (#6578)

    Anandtech: CES 2017 Honor Double Or Nothing Live Blog

    We're here at the Honor Double or Nothing event at CES 2017!


    More...

  9. RSS Bot FEED (#6579)

    Anandtech: Qualcomm Details Snapdragon 835: Kryo 280 CPU, Adreno 540 GPU, X16 LTE

    Qualcomm's Snapdragon 835 is the first mobile SoC to use Samsung's new 10nm FinFET process. It brings a number of updates, including a revamped CPU configuration, that promise to deliver better performance and power efficiency relative to the Snapdragon 820. With its focus on heterogeneous computing, the Snapdragon 835 brings advanced capabilities to virtual reality, photo and video capture, video playback, and machine learning.

    More...

  10. RSS Bot FEED (#6580)

    Anandtech: Linksys Enters Mesh Wi-Fi Market with Velop Whole Home Wi-Fi, Expands Max-Stream Lineup

    Linksys has updates in three different product lines as part of CES 2017. The new product line is the Velop Whole Home Wi-Fi mesh networking kit. Linksys is definitely late to the mesh party, but the delay has enabled the company to include the most important features found in currently available solutions.
    In the current market, Netgear's Orbi is undoubtedly one of the leading 'mesh' solutions. By placing a dedicated backhaul radio (4x4 802.11ac) and restricting the arrangement of the primary router and satellites to a star topology, Orbi delivers excellent performance numbers. The adoption of Qualcomm Atheros's Wi-Fi SON firmware features also enables a good user experience.
    Linksys adopts a similar platform on the software side, as well as similar high-level operation (one channel for backhaul and another for client communication). On the hardware side, the main SoC is the same Qualcomm Atheros IPQ4019. However, instead of a 4x4 802.11ac backhaul radio using the QCA9984 like the Orbi, Linksys has opted for the more economical 2x2 802.11ac QCA9886 radio. This makes the Velop an AC2200-class solution (we are seeing many routers in this class launched at CES 2017). Like the Netgear Orbi, the Linksys Velop supports beamforming and Wave 2 MU-MIMO, and can advertise itself as a 'tri-band' router.
    Interestingly, the members of a Velop installation can use either wired or wireless backhaul. The wireless backhaul channel can be chosen dynamically from the available bands (either of the 5 GHz bands or the 2.4 GHz band). This allows configuration in multiple topologies: point-to-point, true mesh, star, line, or tree. The self-healing aspects of Wi-Fi SON select the most suitable communication path when a connection breaks.
    To enable better performance through suitable antenna placement, the Velop units, like the Orbi, adopt a tower design. The industrial design, like that of most other mesh Wi-Fi kits, is attractive enough not to be stowed away in a closet (something that is very important for a good mesh Wi-Fi experience). Linksys is making the Velop available in 1-, 2-, and 3-packs starting today, priced at $200, $350, and $500 respectively.
    Gallery: Linksys Velop Mesh Wi-Fi Node


    Given the hardware configuration and radio details, it looks like the Netgear Orbi might still take the performance crown for scenarios requiring around 2,000 to 4,000 sq. ft. of coverage. However, the Velop brings some very interesting features to the mesh market. In terms of ease of use, the product follows a mobile-first setup and usage process and offers Amazon Alexa integration, features that are apt for the target market. Linksys's technical transparency also gives us enough insight into the scenarios and use cases in which the Velop might be an effective option. In particular, if a wired backhaul is possible, the Velop could turn out to be a very good candidate for extending Wi-Fi reach. Additional insight into the real-world performance of the Velop kits is definitely something to look out for in the near future.
    In other Linksys CES 2017 news, the Max-Stream lineup has gained some additional members. The Max-Stream AC2200 Tri-Band MU-MIMO Gigabit Router (EA8300) is the single-router version of the Velop platform. The same SoC and radios are used to provide two 5 GHz SSIDs and one 2.4 GHz SSID. The router comes with 256 MB of DDR3 RAM and 256 MB of flash memory. Availability is slated for Spring 2017 at an MSRP of $200.
    The Linksys EA8300
    The Max-Stream AC4000 Tri-Band MU-MIMO Gigabit Router (EA9300) is one of the first routers based on Broadcom's next-generation network processor with a 1.8 GHz quad-core ARMv8 CPU, the BCM4908. It supports the Broadcom XStream configuration (two 5 GHz bands and one 2.4 GHz band) with Wave 2 MU-MIMO capabilities. The radio used in the EA9300 is the Broadcom BCM4365E, which appears to be a 3x3 variant of the BCM4366. Note that this radio supports non-standard 1024-QAM for a 25% higher throughput number (1625 Mbps in each 5 GHz band for a 3x3 configuration, compared to 1300 Mbps for a standard 3x3 configuration with 80 MHz channels). The 2.4 GHz band also supports up to 750 Mbps with this proprietary scheme, allowing the router to be advertised as AC4000 (1625 + 1625 + 750 Mbps). Availability is slated for Spring 2017 at an MSRP of $300.
    The Linksys EA9300
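    The 'AC4000' label is simply the sum of the per-band peak PHY rates, rounded up to a marketing-friendly figure. As a quick sketch of the arithmetic (the 867 + 867 + 400 Mbps rates used for the AC2200 case are the standard 2x2 figures, my assumption rather than numbers Linksys has published):

        # How "ACxxxx" class numbers are derived: add each band's peak PHY
        # rate and round up to the nearest hundred.
        import math

        def ac_class(band_rates_mbps):
            return f"AC{math.ceil(sum(band_rates_mbps) / 100) * 100}"

        # EA9300: 3x3 radios with proprietary 1024-QAM, 25% over the standard rate.
        standard_3x3 = 1300                # Mbps, 3x3 @ 80 MHz, 256-QAM
        boosted_3x3 = standard_3x3 * 1.25  # = 1625 Mbps with 1024-QAM
        print(ac_class([boosted_3x3, boosted_3x3, 750]))  # -> AC4000

        # Velop / EA8300 (2x2 radios): assumed standard 2x2 rates.
        print(ac_class([867, 867, 400]))   # -> AC2200 (2134 rounded up)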
    Both the EA8300 and the EA9300 work with Amazon Alexa for a voice-activated experience. Linksys has put extra focus in the Android / iOS apps for a mobile-first setup and management process. They are also designed to provide a seamless roaming experience with a single SSID and easy connection to the Max-Stream lineup of range extenders.
    Linksys is also bringing a Wave 2 MU-MIMO 2x2 USB 3.0 WLAN adapter to market with the WUSB6400M. It too is slated to arrive in Spring 2017, at an MSRP of $60. Additional MU-MIMO client devices can help consumers take full advantage of their MU-MIMO routers, and these types of USB adapters are important for the MU-MIMO ecosystem.
    On the cable modem side, Linksys is also planning to launch a DOCSIS 3.1 model, the CM3132, with 32 downstream and 8 upstream channels. In terms of the core platform, it is similar to Netgear's CM1000, which is already on the market. However, the CM3132 differentiates itself with two Ethernet ports (unlike the single port on the Netgear CM1000). Availability is slated for Spring 2017 at an MSRP of $200.


    More...
