Thread: Anandtech News

  1. Registered TeamPlayer dex71's Avatar
    Join Date
    12-28-07
    Location
    Gopher/Viking Country
    Posts
    17,455
    #6661

    Re: Anandtech News

    Quote Originally Posted by Kanati View Post
    ZOMG WE'RE ON PAGE 666 ON TRUMP'S INAUGURATION DAY!!! IS THIS A SIGN!?!?
    Technically, we hit page 666 on Barry's watch.

    Fucking Obama.

  2. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6662

    Anandtech: Samsung Reveals Root Cause of Galaxy Note7 Battery Fires

    Samsung held a press conference today detailing the results of its investigation into the battery fires that plagued its Galaxy Note7. Reports of phones catching fire appeared soon after the phone went on sale on August 19, 2016. An initial investigation identified faulty batteries from one of its suppliers, later named by the US Consumer Product Safety Commission as Samsung SDI, as the root cause, and Samsung announced a global exchange program on September 2 under which phones containing the faulty battery would be replaced by ones using a battery from a second supplier. However, after this new batch of phones continued to catch fire, it became clear that the Note7’s battery problem was more complicated, and without a solution in hand, the company was forced to cease production and suspend sales of the Note7 on October 11. To date, Samsung says 96% of the roughly 3 million phones sold to customers around the world have been recovered.
    Over the past several months, Samsung has been working to uncover the root cause of the Note7’s battery fires. The company’s own internal investigation involved over 700 engineers and data gathered from testing 200,000 phones and 30,000 Note7 batteries. To facilitate the testing, Samsung built a large-scale test facility that automated various charging and discharging scenarios that ultimately replicated the failures that occurred in the field.
    To ensure it found the true cause or causes of the failures, it examined everything from the battery to hardware to software to manufacturing and logistics. Samsung tested both wired and wireless charging over a range of voltages and currents, with the Note7’s rapid charging feature both enabled and disabled. To see if the phone’s IP68 rated environmental protection caused heat buildup inside the sealed chassis, it tested phones with and without the back cover in place. It even checked to see if the Note7’s iris scanning feature or running certain preloaded and third-party apps resulted in excessive battery current.
    In addition to its own internal testing, Samsung also enlisted the aid of two independent testing labs—Exponent Engineering and Scientific Consulting and UL—for battery and electronics testing and hired TÜV Rheinland to investigate possible causes due to logistics and assembly.
    Ultimately, these investigations revealed that the Note7 fires were caused by different design and manufacturing defects in the batteries from both suppliers. The lithium-ion batteries used in the Note7 were constructed from long strips of an insulating separator sandwiched between positive and negative electrode foils spirally wound to create an electrode assembly inside a heat-sealed polymer pouch. Electrode tabs were welded to the positive and negative foil electrodes for external connection points.
    For the batteries from Samsung SDI (referred to as “Company A”), the polymer pouch did not provide enough room for the negative electrodes in the upper-right corner closest to the negative tab. Either through normal expansion and contraction of the battery while charging/discharging, or through damage caused during manufacturing or assembly (as discovered by UL), the negative electrodes were bent after contacting the pouch corner, stretching the separator. If the separator failed, the cell would short circuit, which could lead to a runaway thermal failure. UL also discovered that the separator was too thin in some samples, which would increase the likelihood of this type of failure.
    The pouch design of the batteries supplied by “Company B” (Amperex Technology), which were used in the Note7 replacements, had sufficient clearance around the electrodes. The failure of these batteries was instead due to poor control of the ultrasonic welding process used to attach the positive tab to the positive electrode. This spot welding process created sharp protrusions that could bridge the gap between the positive tab and the neighboring negative electrode during normal expansion and contraction of the battery due to thermal cycling, causing an electrical short. Some samples were also found to be missing a layer of protective insulation tape under the positive tab (the separator material was still in place), increasing the likelihood of a short circuit.
    For batteries from both companies, the use of high-energy density cells increased the risk for thermal runaway during a short circuit, especially when the battery was in a high state of charge.
    UL and Exponent also tested the Note7’s internal charging circuitry along with external charging accessories, including Samsung’s charger and popular third-party chargers. UL found that the Note7’s battery temperature and maximum current drain did not exceed rated limits and that battery current, voltage, and temperature did not exceed the battery’s specifications when charging with Samsung’s included charger. Exponent also found the Note7’s internal battery protection circuitry capable of defending it against chargers operating outside specifications. Based on their investigation, the companies concluded that the Note7’s electronics did not contribute to failures of either manufacturer's batteries.
    To reduce the likelihood of future battery failures, Samsung is implementing an 8-point Battery Safety Check that involves additional inspection and testing, and improving training for everyone involved in handling batteries, both during assembly and shipping. It’s also adding more space and brackets around the battery to protect it from crush-related failures, improving the safety standards for the materials inside the battery, and improving the software algorithms that control battery charging temperature, current, and duration. Samsung confirmed that the upcoming Galaxy S8 will benefit from these new procedures and features, noting that they will not significantly impact the S8’s release date, which will occur sometime after Mobile World Congress this year.
    Samsung is eager to move past the Galaxy Note7 and begin the process of regaining consumer trust, which is one reason why it’s sharing the results of its investigation. In a further bid to improve battery safety throughout the industry and repair its reputation, Samsung said it will share its new 8-point inspection plan with global standards organizations.


    More...

  3. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6663

    Anandtech: Xiaomi International VP Hugo Barra to Leave the Company, Return to Silicon Valley

    In a surprising turn of events this evening, Hugo Barra, Xiaomi’s International Vice President, has announced that he will be leaving the company next month.
    Barra has for all practical purposes served as the face of the company outside of its home region of China, having served as the VP of Xiaomi’s business affairs outside of mainland China for the last three and a half years. Brought on board to oversee the company’s expansion out of China, Barra has overseen Xiaomi’s entry into the United States, India, and other important markets for the smartphone manufacturer. It’s an expansion that hasn’t been without its difficulties, though: just earlier this month Xiaomi’s CEO admitted that the company had been expanding too fast, and in an unsustainable manner.
    In his announcement, Barra said that while he’s enjoyed the challenge of the job and the opportunities it provided, the pressure and hours had been taking a toll on his health. So with Xiaomi’s big expansion push seemingly done for now, Barra will be stepping down from the company in February – after the end of the Chinese New Year – reducing his role to that of an external advisor and eventually returning to Silicon Valley. While he hasn’t said what he’ll be doing next, Barra was a member (and later director) of the Android product management team for several years before being tapped by Xiaomi, so he has ample experience in the mobile industry.
    Finally, alongside Barra’s announcement that he’ll be stepping down, Xiaomi is similarly announcing who will replace him. This time they will be tapping internal talent, with current Senior VP Xiang Wang taking on Barra’s duties.



    More...

  4. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6664

    Anandtech: Edge Memory Announces Video Speed Class SD Cards

    As part of our CES 2017 vendor visits, we talked to Mushkin and Edge Memory about their upcoming products. For those unaware, both Mushkin and Edge Memory are brands of Avant Technology. The two brands share a number of products, with Edge Memory focusing more on the enterprise and SMB / SME market segment. Mushkin's CES 2017 announcements were covered last week; the external direct-attached storage products are currently being prepared for sale under the Edge Memory brand.
    The most interesting of all the new products are the Video Speed Class (VSC) SD cards. While the SD Association recently announced the A1 application-performance class for use in general computing environments, the Video Speed Class rating is meant for cards used in video recording environments - including, but not restricted to, action cameras, dashcams, and IP cameras. The key here is that the card must be able to guarantee a minimum sustained write speed to qualify for one of the VSC classes.
    Image Courtesy: SD Association
    In Spring 2017, Edge Memory plans to launch a suite of SDXC (16 - 256 GB) and micro-SDXC (16 - 128 GB) cards with an operating temperature range of -25°C to 85°C. This also makes them suitable for industrial use-cases.
    The cards are expected to come in a range of VSC ratings. Claimed performance numbers indicate 225 MBps reads and 195 MBps writes. However, what really matters is the VSC rating, which reflects the sustained write speed that determines the suitability of these cards for a particular application.
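    To put those ratings in concrete terms, a Video Speed Class mark such as V30 or V60 means the card must sustain at least 30 or 60 MBps of sequential writes for the length of a recording, not just in short bursts. Below is a minimal sketch (our own illustration, not an Edge Memory tool) of how one might measure sustained sequential write speed in Python; the mount path is hypothetical, and a real qualification test would follow the SD Association's procedure rather than this rough loop.

        import os
        import time

        TEST_FILE = "/mnt/sdcard/vsc_test.bin"   # hypothetical mount point of the card
        BLOCK = 4 * 1024 * 1024                  # 4 MiB sequential writes
        TOTAL = 1024 * 1024 * 1024               # 1 GiB written in total

        buf = os.urandom(BLOCK)
        # O_SYNC forces each write through to the card rather than the OS page
        # cache, so the figure reflects the card itself (POSIX systems).
        fd = os.open(TEST_FILE, os.O_WRONLY | os.O_CREAT | os.O_SYNC)
        written = 0
        start = time.monotonic()
        try:
            while written < TOTAL:
                written += os.write(fd, buf)
        finally:
            os.close(fd)
        elapsed = time.monotonic() - start
        print(f"Sustained write: {written / elapsed / 1e6:.1f} MB/s")
        # The V6/V10/V30/V60/V90 classes require >= 6/10/30/60/90 MB/s sustained.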
    Edge Memory also announced a number of other products - a bus-powered USB 3.0 enclosure for M.2 SATA SSDs (using ASMedia ASM1153E for the micro-B version, and the ASM1351/1542 for the Type-C version), a USB OTG-compatible flash drive with both Type-C and Type-A interfaces, and new members in the diskGO Secure Ultra USB Storage lineup which has a keypad / physical PIN support to enable hardware encryption of the contents.
    Gallery: Edge Memory Direct-Attached Storage Products at CES 2017


    Avant Technology seems to be sharing the required R&D for its flash-based product lines between Mushkin and Edge Memory. In bringing the products to market, it is clear that Mushkin is targeting the average consumer / gamer, while Edge Memory targets business and industrial use-cases. The DAS lineup from Edge Memory for the first half of 2017 appears to tap into that market quite well.
    Gallery: Edge Memory Announces Video Speed Class SD Cards




    More...

  5. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6665

    Anandtech: SanDisk Launches A1-Class microSDXC Card

    At CES 2017, we had the chance to catch up with Western Digital to check out the updates from their end in the direct-attached storage space. After the acquisition of SanDisk, a consolidation of sorts meant that we also interacted with G-Technology (from the HGST side) at the same suite. While G-Technology indicated that they have some announcements lined up for the NAB show in April, SanDisk did launch a couple of interesting products.
    The minor product launch was the 256GB version of their Extreme PRO USB flash drive. The high-capacity 'SSD on a flash drive' boasts read and write speeds of 420 MBps and 380 MBps respectively. These are obviously peak numbers. SanDisk wouldn't confirm whether the drives are MLC or TLC, but they did indicate that the drive does NOT have TRIM support.
    The key here is the form factor, with the 256GB version being one of the smallest drives we have seen in that size-class. The drive will be available on Amazon early next month for $130. The CES press release mentioned an MSRP of $180, but even considering the current flash shortage in the market, that would have been a bit too high for what is likely a TLC flash drive with an SLC cache in front.
    The more interesting product was the 256GB SanDisk Ultra microSDXC UHS-I card. It is compliant with the recently-introduced A1 application-class which mandates minimum read and write IOPS. Beyond the usual numbers (reads of up to 95 MBps), the demonstration of the benefits offered by the A1-class card was more impressive. In particular, storage-bound scenarios like game loading times showed a marked improvement.
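    For context, the A1 application class (introduced in the SD 5.1 specification) requires a minimum of 1,500 random read IOPS and 500 random write IOPS at 4 KB, along with 10 MBps of sustained sequential writes. The sketch below shows one rough way to measure 4 KB random-read IOPS; the file path and sizes are our own illustrative choices, and a rigorous test would bypass the OS page cache (e.g. via O_DIRECT) so cached reads don't inflate the number.

        import os
        import random
        import time

        TEST_FILE = "/mnt/sdcard/iops_test.bin"   # hypothetical card mount point
        IO_SIZE = 4096                            # A1 is specified at 4 KB transfers
        FILE_SIZE = 256 * 1024 * 1024             # 256 MiB test region
        OPS = 5000
        CHUNK = 4 * 1024 * 1024

        # Pre-fill the test file so every random offset has real data behind it.
        with open(TEST_FILE, "wb") as f:
            for _ in range(FILE_SIZE // CHUNK):
                f.write(os.urandom(CHUNK))

        offsets = [random.randrange(FILE_SIZE // IO_SIZE) * IO_SIZE for _ in range(OPS)]
        fd = os.open(TEST_FILE, os.O_RDONLY)
        start = time.monotonic()
        for off in offsets:
            os.pread(fd, IO_SIZE, off)            # positioned 4 KB read (POSIX)
        elapsed = time.monotonic() - start
        os.close(fd)
        print(f"Random read IOPS: {OPS / elapsed:.0f} (A1 requires >= 1500)")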
    The card is currently available for $200 on Amazon - a steep price in terms of $/GB, but something to be expected for this form factor.
    Gallery: SanDisk Launches A1-Class microSDXC Card




    More...

  6. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6666

    Anandtech: Kaby Lake E3-1200 v6 Xeon Specifications Leaked

    Some sleuthing by CPU-World has uncovered the list of to-be-released Kaby Lake single-socket quad-core Xeons. As is to be expected, these are incremental updates to the Skylake-based Xeons, using the newer 14nm Plus node from Intel. In our consumer Kaby Lake reviews, our results showed that the new design offers a better voltage/frequency profile than previous generations, affording more frequency at the same voltage. Another big change from the previous generation is the TDP: what used to be 80W is now listed as 73W with integrated graphics, or 72W without.
    The list from CPU-World, which comes via QVL (qualified vendor list, or ‘CPUs which we confirm work in this board’) postings from motherboards such as MSI’s C236M Workstation, features eight processors. The two at the bottom of the stack are quad core parts without hyperthreading, and the others do have hyperthreading. The main differences between the processors will be frequencies and the presence of integrated graphics.
    Intel E3-1200 v6 CPUs (Kaby Lake)
    Model        C/T   Base Freq   L3 Cache   IGP    IGP Freq   TDP
    E3-1280 v6   4/8   3.9 GHz     8 MB       -      -          72 W
    E3-1275 v6   4/8   3.8 GHz     8 MB       P630   1150 MHz   73 W
    E3-1270 v6   4/8   3.8 GHz     8 MB       -      -          72 W
    E3-1245 v6   4/8   3.7 GHz     8 MB       P630   1150 MHz   73 W
    E3-1240 v6   4/8   3.7 GHz     8 MB       -      -          72 W
    E3-1230 v6   4/8   3.5 GHz     8 MB       -      -          72 W
    E3-1225 v6   4/4   3.3 GHz     8 MB       P630   1150 MHz   73 W
    E3-1220 v6   4/4   3.0 GHz     8 MB       -      -          72 W
    Most of these numbers come direct from the motherboard validation lists, with some such as core count being derived from L2 cache listings. All the parts listed have a full 8MB of L3 cache, indicating they run closer to the Core i7 design rather than a Core i5 (even those that have hyperthreading disabled).
    The integrated graphics models, i.e. those ending in '5', all use Intel HD P630 graphics running at up to 1150 MHz. This is the ‘professional’ version of the HD 630 we see on the consumer parts, using Intel’s latest Gen9 graphics architecture and supporting H.265 encode/decode. Our Kaby Lake review goes into more detail.
    Not listed are the turbo frequencies of the CPUs, as these are currently unknown. Neither is pricing; however, given previous launches, we would expect tray prices (OEM batches of a thousand CPUs) to be at parity with previous generations.
    Intel E3-1200 v6 and v5 CPUs
    IGP   v6 (Base, TDP)   Model      v5 (Base/Turbo, TDP)   IGP
    -     3.9 GHz, 72W     E3-1280    3.7/4.0 GHz, 80W       -
    +     3.8 GHz, 73W     E3-1275    3.6/4.0 GHz, 80W       +
    -     3.8 GHz, 72W     E3-1270    3.6/4.0 GHz, 80W       -
    -     -                E3-1260L   2.9/3.9 GHz, 45W       -
    +     3.7 GHz, 73W     E3-1245    3.5/3.9 GHz, 80W       +
    -     3.7 GHz, 72W     E3-1240    3.5/3.9 GHz, 80W       -
    -     -                E3-1240L   2.1/3.2 GHz, 25W       -
    -     -                E3-1235L   2.0/3.0 GHz, 25W       +
    -     3.5 GHz, 72W     E3-1230    3.4/3.8 GHz, 80W       -
    +     3.3 GHz, 73W     E3-1225    3.3/3.7 GHz, 80W       +
    -     3.0 GHz, 72W     E3-1220    3.0/3.5 GHz, 80W       -
    For the most part, the new processors are ~200 MHz faster than the v5 parts while still being rated at the lower TDP. Memory support is expected to be the same as the consumer parts (DDR4-2400), and it is not yet confirmed if the v6 processors will support Transactional Synchronization Extensions (TSX) given issues in previous revisions, so we will wait on future Intel announcements on this front.
    It is still worth noting that for LGA1151-based Xeons, Intel adjusted the requirements such that Xeon processors require a server-grade chipset on the motherboard. For Skylake E3 v5 parts, this was either a C232 or C236 chipset – we reviewed a few motherboards with these chipsets (ASRock E3V5 Gaming, GIGABYTE Z170X-Extreme ECC). With a BIOS update, these C232/C236 motherboards should support the new v6 processors. However, we currently do not know if there will be a second generation of chipsets for these CPUs in line with the consumer updates; on the consumer side the new chipset has additional PCIe lanes and Optane Memory support, so we wait to see whether a matching desktop chipset arrives. There is a new mobile chipset, CM238, for mobile E3 v6 Xeons, but no equivalent in the desktop space yet.
    We currently have all the Skylake E3 v5 Xeons in for testing on our new benchmark suite, and we’ll make similar moves to acquire the Kaby Lake E3 v6 models when they are released. There is currently no word on release date or pricing; however, we typically see the E3 Xeons released very shortly after the consumer processors.
    Source: CPU-World


    More...

  7. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6667

    Anandtech: Xiaomi Announces the Redmi Note 4 in India

    Just before Hugo Barra announced he was leaving the company, he and Xiaomi on Thursday introduced the company's latest phablet, the Redmi Note 4, in India. The design of the smartphone resembles that of the Chinese version, but the internal architecture has changed. Instead of the MediaTek Helio X20 SoC, the Indian version of the Redmi Note 4 uses Qualcomm’s Snapdragon 625, a shift from a tri-cluster 10-core A72/A53/A53 design to a full eight-core A53/A53 SoC. Meanwhile, despite the hardware switch, the concept of the device has not changed: a well-equipped phablet that will retail at price points below $200.
    Barra published the specs and prices of the Redmi Note 4 phablet that will be available in India. The company has changed a lot about the device both inside and out, possibly because the newcomer will eventually be available globally and will thus have different competitors than it does in China. The overall formula, though, remains familiar: it is a 5.5” smartphone that comes in a metallic unibody chassis with rounded edges and antennas separated from the rest of the back cover by polycarbonate strips. The new Redmi Note 4 lineup will include matte black, matte gold and matte gray versions.
    When Xiaomi introduced the Redmi Note 4 in China several months ago, the company used MediaTek's deca-core Helio X20 in a market where core count can matter at this price point (as in, it affects buying decisions). For the new version, Xiaomi uses a Qualcomm Snapdragon 625 SoC that features fewer general-purpose cores, a single-channel memory controller and a modem with more bands. More importantly, the chip is made using Samsung’s 14LPP (14 nm FinFET, low-power plus) manufacturing technology and presumably has generally lower power consumption than the Helio X20, which is made using TSMC’s CLN20SOC (20nm planar) fabrication process. In any case, Xiaomi says that the Redmi Note 4 with its 4100 mAh battery lasts 25% longer than its predecessor (the Redmi Note 3 with a 4050 mAh battery), an indicator that the new unit uses components with lower power consumption.
    On the other hand, the Xiaomi Redmi Note 3 is powered by the Snapdragon 650, which features two high-performance ARM Cortex-A72 cores as well as four low-power ARM Cortex-A53 cores, compared to the quad A53 + quad A53 of the Snapdragon 625 in the Redmi Note 4. It will be very interesting to see whether four A53 cores can replace two A72 cores without performance degradation (versus the predecessor) in demanding applications. However, it does highlight an important trend we're seeing in the industry.
    In the last generation of mid-range smartphones, a number of companies were happy to take a 'hexa-core' design: big.LITTLE using dual A72 and quad A53. This allowed the SoC to offer good peak performance, using some of the highest-performing cores available at a high frequency, and to move to the small cluster in power-saving mode. However, these designs were on 28nm - a popular but not leading-edge process node. So far this year we've seen a number of devices announced that ditch the pair of A72 cores for another set of quad A53 cores on SoCs built on a 14nm node. The performance of the cores doesn't change with the process node, but the power consumption does: using a 14nm S625 over a 28nm S650 means that battery life goes up (all else being comparable) while peak performance generally goes down. The interesting intersection is the energy required to complete the same amount of work: it is generally considered that a 14nm S625 wins that one as well. This is despite the fact that the chip probably costs more, by virtue of the 14nm process. It would seem that vendors are willing to take the hit on price and performance in exchange for battery life (other such devices announced include the honor 6X, Huawei Nova/Plus and the ASUS Zenfone 3 Zoom). Another downside of these S625 devices seems to be that some don't support 802.11ac.
    Xiaomi Redmi Note 4
                       2 GB/32 GB version    3 GB/32 GB version    4 GB/64 GB version
    SoC                Qualcomm Snapdragon 625: 8 × ARM Cortex-A53 at 2 GHz, Adreno 506 at 624 MHz
    RAM                2 GB LPDDR3           3 GB LPDDR3           4 GB LPDDR3
    Storage            32 GB + microSD       32 GB + microSD       64 GB + microSD
    Display            5.5" 1920x1080 (403 ppi)
    Network            4G: LTE FDD, LTE TDD
                       3G: WCDMA (DC-HSDPA, DC-HSUPA), TD-SCDMA, EV-DO, CDMA
                       2G: GSM/EDGE
                       NB! Based on the S625 features; actual capabilities may be different.
    LTE                Down: 300 Mb/s, Up: 150 Mb/s
    Fingerprint        Yes
    Audio              Hexagon 546 DSP, integrated speakers, 3.5-mm TRRS connector
    Dimensions         unknown
    Weight             ~175 grams
    Rear Camera        13 MP, dual LED flash, f/2.0 aperture
    Front Camera       5 MP, f/2.0
    Battery            4100 mAh
    OS                 Google Android 7 with MIUI 8
    Connectivity       802.11 b/g/n Wi-Fi, Bluetooth 4.1, Micro-USB 2.0
    Navigation         GPS + GLONASS
    SIM Size           Nano SIM + microSD / Dual Nano SIM
    Colors             Black, Gold, Grey
    Launch Countries   India
    Price              Rs. 9,999 ($146)      Rs. 10,999 ($161)     Rs. 12,999 ($190)
    The Redmi Note 4 has a 5.5-inch FHD IPS display covered with 2.5D Gorilla Glass for protection. The Chinese version of the Redmi Note 4 was claimed to have a maximum brightness of 450 nits, a contrast ratio of 1000:1, a 72% NTSC color gamut, as well as a special technology that improves the visibility of the display outdoors, but we do not know whether the Indian version uses the very same display panel.
    As for imaging capabilities, the Xiaomi Redmi Note 4 uses a 13 MP sensor with an f/2.0 aperture, PDAF and a dual LED flash on the back, as well as a 5 MP sensor with an f/2.0 aperture on the front. Audio features of the Xiaomi RN4 include built-in speakers as well as a 3.5-mm TRRS audio jack on top. Meanwhile, for local connectivity, the phone features 802.11n Wi-Fi, Bluetooth 4.1 and a micro-USB port. While we understand that the Snapdragon 625 supports LTE, WCDMA, CDMA and GSM, Xiaomi has so far not announced specific bands for the Redmi Note 4. In the best-case scenario, the handset supports everything the SoC does, but the manufacturer has not confirmed that yet.
    The Xiaomi Redmi Note 4 uses Google’s Android 7 with various enhancements by Xiaomi, including new security features of the MIUI 8 designed to simplify usage of the fingerprint scanner.
    The Xiaomi Redmi Note 4 went up for sale in India on January 23rd. Prices range from Rs. 9,999 ($146) for the entry-level 2 GB/32 GB model to Rs. 12,999 ($190) for the high-end 4 GB/64 GB SKU.
    Gallery: Xiaomi Announces Redmi Note 4 in India: 5.5 Inch, Snapdragon 625, 4 GB/64 GB, 4100 mAh






    More...

  8. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6668

    Anandtech: Razer Updates The BlackWidow Chroma Keyboard: New Ergonomics And Switch Choices

    The Razer BlackWidow keyboard has been around since 2010, and it is one of Razer’s most popular gaming keyboards. Today Razer is announcing an updated version of this keyboard, which adds more choices for their buyers.
    The first change may seem like a small thing, but they’ve added a magnetically attached wrist rest, similar to some of their other models like the Razer Ornata. Those who prefer a wrist rest can leave it attached, and those who prefer a smaller keyboard can take it off. I’ve not used this model, since it has just been announced, but the magnetic attachment of the wrist rest on the Ornata seems to hold it securely. This is a nice value-add for Razer, and should please many buyers.
    Their big news, though, is that the BlackWidow Chroma V2 now comes with a third switch option: the new Razer Yellow switch. Razer started manufacturing their own keyboard switches in 2014, and they currently offer both a Green and an Orange model, both of which offer tactile feedback. The new Razer Yellow switch is a linear, silent design, and features a reduced travel distance. Razer says this shorter throw allows for faster key presses, and the quieter design should be welcome news to the many people who want a mechanical keyboard with a bit less noise.
    The Green switches have a 50 gram actuation force, whereas the Orange and Yellow both have a lighter 45 gram actuation force, and all Razer mechanical switches are rated for an 80 million keystroke lifespan.
    As with most Razer products, the BlackWidow V2 features Razer’s Chroma lighting, with individually backlit keys offering 16.8 million colors per key, as well as the Chroma effects powered by their Synapse software. It also features 10-key rollover anti-ghosting, and offers fully programmable keys with on-the-fly macro recording. The USB keyboard features 1000 Hz polling, a braided USB cable, a USB pass-through, and a 3.5 mm 4-pole audio pass-through jack.
    Despite the included wrist rest, the new BlackWidow V2 Chroma keyboard has a price that is unchanged from the previous model, and select models are shipping now for U.S. $169.99 / EU €199.99.
    Source: Razer


    More...

  9. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6669

    Anandtech: NVIDIA Releases 378.49 WHQL Driver Update

    A little over half a month in, we have our first driver release from NVIDIA for the new year. Both camps were so rapid-fire with driver releases last year that I was beginning to wonder what happened. I guess between game releases slowing down and the holidays, the driver developers didn’t want to code new updates just for their own entertainment. Teasing aside, today’s update isn’t joking material: it not only gives us the usual bug fixes and game support, but possibly enough SLI profiles to make multi-GPU gamers happy.
    We are getting a new branch today with driver release 378. For fixed issues, this time around we have random flashes in Just Cause 3 and flickering faces in Assassin’s Creed: Syndicate. Some SLI-induced flickering in both Battlefield 1 and Hitman has also been fixed, and Battlefield 1 has received a fix for rain puddles that were appearing dark. Lastly, NVIDIA has issued a fix for work unit errors in Folding@Home; fingers crossed this does away with the Folding@Home issues people have been clamoring about for months.
    For extra features, 378.49 adds support for the recently launched GeForce GTX 1050 and 1050 Ti notebook cards. Game Ready support is also bundled in for Resident Evil 7: Biohazard, the Conan Exiles Early Access, and the For Honor closed beta. Not letting its one-month break since the last driver release go to waste, NVIDIA has also added or updated SLI profiles for the following games:

    • Battlefield 1
    • Deus Ex: Breach Standalone - added DirectX 11 profile
    • Diablo III - added DirectX 11 profile
    • Dreadnought (2016) - added DirectX 11 profile
    • LEGO: Minifigures Online - added SLI-Single profile
    • Sid Meier's Civilization VI
    • Shooter Game (HDR) - added DirectX 11 profile
    • Sniper Elite 4 - added DirectX 11 profile
    • Space Hulk: DeathWing - added SLI-Single profile
    • Tom Clancy's Ghost Recon: Wildlands
    • Watch Dogs 2

    This is a notably bigger list of SLI profiles than we typically see. I couldn’t say whether this is a re-ignited initiative or just a consequence of the new driver branch. Regardless, it gives SLI users more to be excited about.
    Anyone interested can download the updated drivers through GeForce Experience or on the NVIDIA driver download page. More information on this update and further issues can be found in the 378.49 release notes.


    More...

  10. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    #6670

    Anandtech: Microsoft Readies Game Mode For Windows 10 Creator’s Update

    The Windows 10 Creator’s Update was announced in October, at the same event where the Surface Studio was launched. It promises many new features for makers, but when makers want to unwind, they want to play a PC game. Today Microsoft is giving a glimpse at one of the other features coming with the Creator’s Update: Game Mode.
    Microsoft wants to improve the overall gaming experience, and they have focused on several areas where gaming on the PC is let down, especially compared to the console, where the experience is known to all parties, be they developers or end users, well ahead of time. With Game Mode, Microsoft is continuing its steps toward bridging the divide between the gaming PC and the Xbox.
    Some parts of Game Mode have already appeared in the last several builds of the Windows 10 Insider Preview, but today the full Game Mode experience will be launching alongside the latest Fast Ring preview.
    Game Mode is an optional setting which can be leveraged for either Win32 or UWP games. The experience will be better with UWP games, only because a UWP game has known limits on what is running, whereas a Win32 game is boundless. When enabled, Game Mode dedicates more of the CPU and GPU time to the game when it is in the foreground, which should, in theory, help with overall game performance. In an interview yesterday, Kevin Gammill, Partner Group Program Manager, Xbox Platform, discussed how this helps performance. Kevin was less concerned about peak framerate, and instead discussed how Game Mode can deliver a more consistent framerate, meaning fewer stops and stutters when the action gets intense.
    Game Mode settings in the Game Bar, not enabled yet
    Game Mode will set CPU core affinity and thread priority to maximize the CPU resources dedicated to the game. Microsoft has found that there is a lot of thread contention when gaming, often from programs and resources that are not part of the gaming experience. The idea of a higher-priority thread is not new, but enabling it on-the-fly automatically is a nice way to take advantage of this feature. System resources for other applications will be diminished, of course, since there is only so much CPU time available, so background activities that require a lot of CPU time are going to suffer. Game Mode can be enabled or disabled as needed, though, allowing some flexibility here. The same idea applies to the GPU, where more GPU time slices are allocated to the game. The fundamentals are similar to how the Xbox One operates when gaming.
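    Windows does not expose Game Mode's internal hooks, but core affinity and priority class are ordinary OS-level knobs, so the CPU half of the idea can be mimicked from user space. Here is a minimal sketch using the cross-platform psutil library with a hypothetical process name; Game Mode itself does this automatically for the foreground title, and the GPU time-slice allocation it also performs is not reachable from a user-mode script.

        import psutil

        GAME_NAME = "game.exe"  # hypothetical foreground game process

        for proc in psutil.process_iter(["name"]):
            if proc.info["name"] == GAME_NAME:
                # Pin the game to a fixed set of cores, loosely analogous to
                # Game Mode reserving CPU time for the foreground title.
                proc.cpu_affinity([0, 1, 2, 3])
                # Raise the scheduling priority (Windows-only psutil constant)
                # so the game's threads win contention with background work.
                proc.nice(psutil.HIGH_PRIORITY_CLASS)
                print(f"Boosted PID {proc.pid}")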
    Game Mode will work in conjunction with other technologies which make gaming on the PC an easier experience, such as NVIDIA’s GeForce Experience, which will optimize games for NVIDIA based cards.
    Microsoft has been heavily updating the gaming capabilities of Windows, ever since the launch of Windows 10, and Game Mode appears to be another nice addition. It should be available tomorrow in the next Fast Ring build of the Windows Insider Preview.
    Source: Microsoft



    More...
