
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5861

    Anandtech: Renice Announces X9 Military-Grade Rugged SSDs with R-SATA Connectors

    Shenzhen Renice Technology, a little-known maker of special-purpose SSDs from China, has introduced its new family of drives designed for military and rugged applications. What makes these new drives notable is that they are designed to withstand harsh environments, utilizing special R-SATA connectors as well as custom controllers. On the NAND side, the drives are built around SLC, MLC or pseudo-SLC NAND flash memory to provide the required balance between endurance, price and performance.
    The Renice X9 R-SATA SSDs are powered by the company’s own controller, the RS3502-IT, as well as SLC or MLC NAND flash memory from undisclosed manufacturers. The drives are made in accordance with the MIL-STD-810F spec to withstand harsh environments and poor handling, and feature integrated power failure protection as well as 256-bit AES encryption with several secure erase functions (both software and hardware). The Renice X9 SSDs use the company’s proprietary 15-pin R-SATA connectors as well as special 7-pin SATA power ports, which limits compatibility of the drives to applications that feature the same connectors.
    The SLC-based X9 R-SATA drives are offered in 128 GB – 1 TB configurations, whereas MLC-powered SSDs are available with 256 GB – 2 TB capacities. If required, Renice can use MLC NAND in pseudo-SLC mode in order to provide higher endurance and performance at a predictable cost. The Renice X9 comes in 2.5” form-factor with 7, 9 or 15 mm thickness, depending on the exact capacity configuration.
    The manufacturer claims that the X9 SSDs have a maximum sequential read speed of up to 530 MB/s and a maximum sequential write speed of up to 500 MB/s, along with a 0.1 ms access time. The company does not specify random read/write performance of the drives, but claims that it is “excellent”. Keeping in mind that X9 SSDs can use different types of NAND flash memory and come in various configurations, random I/O performance of different models is likely to vary greatly.
    X9 2.5" R-SATA SSD Specifications
    Renice X9
    Capacity 128 GB - 1 TB (SLC)
    256 GB - 2 TB (MLC)
    Type of NAND SLC
    MLC
    pseudo-SLC
    Controller RS3502-IT
    Interface R-SATA (proprietary) 6 Gbps
    Form-Factor 2.5" 7/9/15 mm
    Maximum sequential read/write 530/500 MB/s
    DRAM Cache Supported
    AES 256-bit
    Overvoltage Protection +
    Power Loss Protection Yes, features four supercapacitors
    Secure Erase Hardware and Software
    MTBF 4 Million Hours
    ECC 80 bits at 1024
    Vibration 16 G (10 - 2000 Hz)
    Shock 1500 G at 0.5 ms half sine wave
    Humidity 5 - 95%
    Operating Temperatures -40ºC ~ +85ºC (industrial)
    -55ºC ~ +125ºC (military)
    Power Consumption 10 W (2 TB)
    With regards to reliability, the Renice X9 drives boast an MTBF of over four million hours, while their physical tolerance is rated at 16 G (10 – 2000 Hz) sustained vibration and 1500 G (at 0.5 ms half sine wave) shock, something that typical drives simply do not offer (Intel's DC S3710 SSD, by comparison, can tolerate 2.17 G operating vibration and survive 1000 G shock). The X9 SSDs are designed for operating temperatures ranging from -40°C to +85°C (industrial-grade) or even -55°C to +125°C (military-grade), which is in line with other SSDs for extreme environments (such as those from Amtron or Foremay). Meanwhile, datacenter drives like the DC S3710 can only operate when their temperature is between 0°C and +70°C.
    Given that Renice is a Shenzhen-based company, it is admittedly unlikely that the X9 SSDs will be used in military or defence applications outside of China. Nonetheless, since the drives are designed to withstand harsh environments in general, the X9 SSDs can be used in aerospace, industrial, transportation, outdoor storage and other systems that are subject to extreme temperatures, severe vibrations and other types of harsh environments.
    Exact pricing of the Renice X9 depends on memory configuration and additional features that the manufacturer may offer (e.g., extended operating temperatures). Typically, rugged drives cost significantly more than typical SSDs.
    Gallery: Military Grade X9 2.5" R-SATA SSD (With Ruggedized SATA Connector)




    More...

  2. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5862

    Anandtech: The Tesoro Sagitta & Thyrsus Gaming Mice Capsule Review

    Tesoro Technology (not to be confused with Tesoro Corporation of petroleum products) is a relatively new manufacturer of computer peripherals, founded back in 2011. Our first review of their products came a year ago, with the Tesoro Lobera Supreme Mechanical Keyboard. Since then, even though the company already had many products available, they have made some serious investments, overhauling their website and releasing several new peripherals. Nearly half of those new releases were gaming mice, and today we are taking a quick look at two of them in this capsule review, the Sagitta Spectrum and the Thyrsus Spectrum.
    Tesoro likes giving their products strong, historical names, and these two mice are no exception. For keyboards, the company is usually borrowing the names of historical and legendary swords. For their mice, the naming scheme is a little more complicated. Sagitta stands for "arrow" in Latin and is a small constellation (not to be confused with Sagittarius). The Thyrsus is a mythical Greek staff wielded by the followers of the ancient god Dionysus, a symbol of prosperity and pleasure. Tesoro attaches the "Spectrum" classification to all of their RGB lighting products. Whether these two mice can live up to their showy names, we will find out in this review.
    Packaging & Bundle

    Both mice share the same trapezoid cardboard packaging, the top cover of which can be opened to reveal the mouse and allows for some basic grip testing above the transparent plastic shroud. Pictures of the mice decorate their packaging and the most basic features are clearly listed.
    Tesoro supplies only a quick start guide and a leaflet with either the Sagitta or the Thyrsus. There are no accessories or even media accompanying either of the mice. For proper functionality, their software needs to be downloaded from the company's website.
    The Tesoro Sagitta Spectrum Gaming Mouse

    Physically, the Sagitta Spectrum is based on a very simple design, heavily resembling the legendary Microsoft Intellimouse. The top matte black cover has a soft texture and the sides feature a reflective piano black finish. Despite the symmetrical shape, the Sagitta is not meant to be used by left-handed users, as the two thumb keys are on the left side of the mouse. A third extra button can be found right beneath the scroll wheel.
    At the underside of the Sagitta, we can only see the generously large gliding pads and the PixArt 3310 Optical 5000 DPI sensor. Gamers will enjoy seeing that an optical sensor is being used, as it is supposed to neutralize hardware acceleration and offer better accuracy.
    Once powered on, the sides of the scroll wheel and the Tesoro logo at the front of the Sagitta are illuminated. The intensity and color of the lighting can be controlled via the software, individually for each of the two lighting points.
    The software that is supplied with the Sagitta Spectrum may not be the most advanced that we have ever seen, but it is well-written and much better than we originally anticipated. The main tab of the software allows the user to program different profiles and sync them to specific programs, automatically switching the profile when a program/game has been launched. All of the buttons can be reprogrammed or even disabled. The user can select from mouse, keyboard and multimedia commands, launch external software, switch profiles or activate programmed macros.
    The macros can be recorded "on the fly" or through the fourth tab in the software. Unfortunately, the software records only keystrokes and mouse clicks, not mouse movements, relative or absolute. Relative mouse movements can be inserted into a macro after it has been programmed, but even if relative movements would work for the kind of macro you want to program, it is exceedingly difficult to guess exactly how many pixels you want the mouse to move. In practice, the software can only be used effectively to record mouse clicks and keystrokes and, if required, to adjust the delays afterwards in order to speed up the execution of the macro.
    Gallery: Sagitta Spectrum Software


    The second tab of the software is the "Performance" tab, where parameters such as the DPI settings and the polling rate can be adjusted. There is no way to reduce or increase the number of DPI stages, which is set to four. The scroll and double click speeds can be adjusted, as well as the software acceleration and the lift height. The polling rate can be reduced from 1 kHz down to 125 Hz, although there is no real reason to reduce it on a wired mouse, unless it is to work around a compatibility problem.
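To put those polling-rate options in perspective, here is a quick sketch (mine, not part of Tesoro's software) that converts a polling rate into the interval between position reports:

```python
# Hypothetical helper: convert a USB polling rate in Hz to the interval
# between position reports, in milliseconds.
def report_interval_ms(polling_rate_hz: int) -> float:
    return 1000.0 / polling_rate_hz

# At 1000 Hz the mouse reports every 1 ms; dropping to 125 Hz stretches
# that to 8 ms, adding up to 7 ms of worst-case extra input latency.
for rate in (1000, 500, 250, 125):
    print(f"{rate:>4} Hz -> {report_interval_ms(rate):.0f} ms per report")
```

This is why lowering the rate only makes sense as a compatibility workaround: every step down directly increases the time between updates.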
    Finally, the lighting tab allows the user to select the color and intensity of the LED lighting beneath the company's logo and the scrolling wheel. The software offers nine colors as a basic template but allows virtually any color of the RGB scale to be selected. Aside from static lighting, the software also has four special effects pre-programmed, "breathing", "loop", "rainbow" and "Tesoro Purple", without the possibility to manually program or add more. The lighting effect and/or color of the scrolling wheel and the main logo is programmed individually, meaning that the user can program a different color/effect at each of the two lighting points.
    The Tesoro Thyrsus Spectrum Gaming Mouse

    Although they share the same colors, the Thyrsus is physically more complex than the Sagitta, sporting an asymmetric design with a bulky body, clearly favoring palm over claw gripping. The top cover of the mouse is sprayed with a matte black paint and the surrounding body has a reflective piano black finish. At the top of the mouse, the oversized main buttons envelop the scrolling wheel. A tiny extra button can be found beneath the main buttons, near the top of the body.
    Tesoro is marketing the Thyrsus towards MMO gamers, who frequently require a lot of buttons and macros. The company placed six thumb keys on the left side of the Thyrsus that can be programmed via the downloadable software. There are no buttons on the right side of the mouse.
    At the underside of the Thyrsus, we can see four gliding pads, two of which are highly asymmetric. If those wear out, we suspect that finding proper replacements will be quite difficult. Unlike the Sagitta, the Thyrsus does not have an optical sensor but rather a laser sensor. The ADNS-9800 sensor is supplied by Avago, has a maximum resolution of 8200 DPI and, according to the manufacturer's specifications, zero hardware acceleration.
    Exactly as with the Sagitta, the sides of the scroll wheel and the Tesoro logo at the front of the Thyrsus are illuminated. However, the lighting cannot be individually controlled via the software, meaning that programmed settings affect both lighting points simultaneously.
    At the time of this review, the software that Tesoro supplies for the Thyrsus is very poor. It even lacks an installer, launching as a portable application. This actually may be an advantage if someone wants to pick up their mouse and use it with another system, for example at a friend's house, but we reckon that there will be very few such cases, in contrast with the majority of users who would prefer (or at least are accustomed to) installable software.
    There are only two tabs in the software, with multiple options crammed together. The "Basic Config" tab allows the user to modify the actions of every button on the mouse, its polling rate and DPI level settings, as well as the lighting. Almost every option in this tab is greatly restricted. There are five "settings", or profiles, which cannot be synced with applications and need to be changed manually. There are six DPI levels, the number of which cannot be reduced or increased; only their sensitivity can be adjusted. Finally, the lighting options allow the user to select either two basic lighting effects or a single, static color. The static color is programmed by controlling the intensity of the red, green and blue LEDs, each adjustable from 0 to 3, meaning that there are "only" 64 possible configurations. Although 64 configurations are more than enough, the lack of a preview means that the user has to effectively guess the output color, as very few people can intuit the end result of having, for example, the red, green and blue LEDs at 33%, 33% and 66% output respectively.
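The 64-configuration figure follows directly from the 0–3 intensity range per channel. A small illustrative script (hypothetical, not Tesoro's software) enumerates the combinations and maps one to an approximate preview color:

```python
from itertools import product

# Each of the R, G and B channels takes an intensity level from 0 to 3,
# giving 4 * 4 * 4 = 64 possible combinations.
LEVELS = range(4)

def to_hex(r: int, g: int, b: int) -> str:
    # Scale each 0-3 level to the usual 0-255 range for an approximate preview.
    scale = lambda v: v * 255 // 3
    return f"#{scale(r):02X}{scale(g):02X}{scale(b):02X}"

combos = list(product(LEVELS, repeat=3))
print(len(combos))      # 64 possible configurations
print(to_hex(3, 0, 3))  # full red + full blue -> #FF00FF (magenta)
```

A preview like this is exactly what the software lacks: without it, guessing the output of an arbitrary level combination is left to the user's imagination.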
    The "Advanced Config" tab is not really advanced at all. There are a few options for adjusting the cursor, scrolling and double click speeds, but no advanced options regarding the sensitivity, acceleration and lift height, even though the sensor supports those. Finally, there is a very basic macro recorder that can only record a few keystrokes per macro, then writes the macro to a file.
    Final Words & Conclusion

    As with all of the peripherals we review here at AnandTech, I always try to use each device over the course of a few days. Especially for mice, this allows me to gain a personal perspective on the long-term comfort and learning curve. In the case of the Sagitta and Thyrsus, as they are gaming mice, I tried to use them as much as possible for gaming sessions, logging some hours of gameplay in an action MMORPG and a real-time strategy game.
    The Sagitta performed very well during the gaming sessions. Even though I am used to heavier mice, the shape of the mouse was comfortable and the learning curve was very short. I would not go quite as far as to claim that it felt like an extension of my hand, but the Sagitta is comfortable and performs well for casual gaming sessions. The low weight can reduce fatigue during very long gaming sessions, but some advanced users might not enjoy the impact this may have on their aiming accuracy.
    The software that Tesoro supplies for the Sagitta is not perfect, but it is responsive and has an adequate number of options. It was easy to adjust the settings of the mouse to my personal requirements and program a profile that would initiate once the game has been launched, changing the actions of the two thumb buttons to macros. I found the included macro recorder lacking, as my macros almost always require the input of mouse movements as well, but it was no problem to assign executable macro files to the buttons.
    We found the retail price of $60 for the Sagitta Spectrum to be good for a gaming mouse with an optical sensor, Omron switches and RGB lighting. There are a few more options around this price range, but good gaming mice with optical sensors usually start at the $80 mark, making the simple and very effective Sagitta a rather competitive product.
    Buy Tesoro Sagitta Spectrum on Amazon.com
    Even though the Sagitta performed very well, we cannot claim that the Thyrsus impressed us. The body of the Thyrsus is bulky but exceedingly light for its size, making it difficult to perform very precise movements with. Its design forces the user to adopt a palm grip, as it is next to impossible to hold it with a claw grip and be able to effectively press all six of the thumb buttons. Even with a palm grip, the user needs to get accustomed to gripping the Thyrsus properly, counteracting the force of the thumb with the pinky finger in order to keep the mouse from moving. Even though I am accustomed to using mice with many thumb buttons, such as the Logitech G602 and the Corsair M95, I personally found this difficult, and even after a couple of hours of gameplay the Thyrsus kept feeling uncomfortable to use.
    Considering that the software that Tesoro currently supplies with the Thyrsus is just a portable application and that the software of the simpler Sagitta is much better, I am hopeful that the company will introduce a proper software package in the near future. The portable version that is currently available for download from the company's servers is certainly not adequate for a gaming mouse, especially for an advanced product with six thumb buttons.
    The Thyrsus is a good product in terms of quality, even if (in my opinion) it is too lightweight for a MOBA/MMO gaming mouse. However, considering the retail price of $70 and the currently very poor software, it is very hard to recommend it over the multiple available alternatives, such as Corsair's Scimitar ($60) and Logitech's G600 ($50).
    Buy Tesoro Thyrsus Spectrum on Amazon.com


    More...

  3. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5863

    Anandtech: Oculus Rift Launch Day News: New AMD & NVIDIA Drivers; Async Timewarp & Pl

    After just over three and a half years and a Facebook acquisition in between, Oculus’s first-generation Rift headset is launching today. We went hands-on with the headset at GDC 2016 earlier this month and will have a formal review later, but in the meantime, for those lucky backers or early pre-order customers who will be receiving their units this week, let’s talk drivers and software.
    As you’d expect for such a high-profile device launch, both AMD and NVIDIA have new drivers being released in part to update their support for the Rift, along with blog posts lauding the release of the device. For both companies, the launch of VR headsets represents the opening of what could potentially be a large and profitable market for high-end GPUs. The rendering requirements of VR – both in resolution and framerates – greatly surpass the common 1080p60 monitor, and the novelty of VR could further spur on additional hardware sales. So both GPU vendors are taking the launches of VR headsets over the next two weeks very seriously.
    Launch Day Drivers

    Starting with AMD then, the company has released their Radeon Software 16.3.2. The big news here is of course official launch support for the Rift, including Oculus’s new 1.3 SDK (more on this in a bit). This driver is also adding support for the HTC Vive ahead of its launch next week, and AMD’s Radeon Pro Duo video card. The latter still does not have an official release date, but given the added driver support and Q2 release date, it’s clearly going to be soon.
    Along with the driver release, AMD’s graphics technology group has also released a blog post on their website dubbed “Asynchronous Shaders Evolved”. In it the company offers a recap of their asynchronous shader technology implementation, and further reveals that they have assigned a formal name to their async shader prioritization capability: Quick Response Queue. AMD first talked about this ability last year when they first began discussing async shading with the public, and now in their recent drivers they have enabled the feature. Quick Response Queue goes hand-in-hand with asynchronous timewarp, which ideally is executed as late as possible and as quickly as possible to produce best results, in this case allowing AMD to send the job through at high priority without resource sharing unnecessarily slowing down the job. Do note however that Quick Response Queue is a GCN 1.1+ feature, though for the purposes of VR in particular, all of AMD’s GCN 1.0 products fall below Oculus’s recommended minimum of a Radeon R9 290.
    Meanwhile over at NVIDIA, they too have a new driver release with 364.72, the third release from the R364 branch. 364.72 brings launch support for both the Oculus Rift and next week’s HTC Vive, along with the various launch titles for those headsets, and all of the features present in the latest iteration of NVIDIA’s VRWorks technology. As for non-VR gamers, this is also the game ready launch driver for Dark Souls III, Killer Instinct, and Quantum Break.
    The company has also published a separate blog post celebrating the launch of the Rift. In the post NVIDIA announces that GeForce Experience’s game settings optimization service now supports VR settings as well, allowing the software to dial in NVIDIA’s optimal settings for playing various games in VR. Only a limited number of games have VR settings available at this time, but NVIDIA seems to have hit a decent chunk of the graphically strenuous titles – EVE: Valkyrie, Lucky’s Tale, and Chronos – while promising to add further VR games in the future.
    News From Oculus: Async Timewarp Now In Production, Non-Oculus Store Applications To Be Flagged As “Unknown”

    Tying all of this launch day activity together, Oculus has released a new blog on their developers site outlining the current state of asynchronous timewarp support for the Rift. After initially announcing their implementation of the technology just a bit over a year ago, Oculus has promoted the technology to production status for the Rift on Windows. In the post they outline a bit of the work they had to do to implement async timewarp on Windows – since it’s not a real time OS, they can’t necessarily count on getting resources allocated or jobs completed as quickly as they’d like – which in turn has involved working with AMD, NVIDIA, and Microsoft to get both OS GPU scheduling and each vendor’s GPU drivers up to snuff to handle GPU prioritization and pre-emption as they expected to use it. The blog post also confirms that Oculus is using AMD’s LiquidVR and NVIDIA’s VRWorks technologies respectively to expedite the process and access functionality not normally available via DirectX 11, which has no real concept of asynchronous shading.
    The blog post also confirms the technical requirements for the async timewarp. The feature works as far back as Windows 7, but owing to the significant GPU scheduling updates made with Windows 10 and its underlying WDDM 2.0 driver stack, the feature works best on Windows 10, specifically the most recent update (10586.164). In the long run here I expect that Windows 10 will be the primary platform for VR – teething issues and all – due to the combination of WDDM 2.0 and ultimately the greater flexibility offered with DirectX 12.
    Meanwhile, now that async timewarp has been promoted to a production feature, Oculus is announcing that they are enabling it by default in the Oculus VR SDK 1.3. Games using this SDK will not need to do anything special to make use of the feature, which means this should allow for a fairly rapid rollout of the feature. However the post also implies that only games that are compiled against the 1.3 SDK get this feature, which is notable since the SDK was only released today. As a result it’s not clear whether any or all launch titles have async timewarp enabled at this time, or whether it’ll need to be patched in with a future game update.
    On a final note about async timewarp, in their blog post Oculus also spends a bit of time discussing how async timewarp works and what it’s used best for. There’s an especially good passage outlining how async timewarp’s re-warping capability to cover for missing frames (as opposed to updating head-tracking of a frame at the last second with an initial warp) is meant to be used as a failsafe option, and that developers shouldn’t use it to try to get away with games that run below 90fps. I especially like this passage – it’s the greatest summary of why you can’t cheat your way to VR that I’ve ever read – so I’ve gone ahead and reprinted it below.
    However, ATW is not a silver bullet. Failing to maintain a consistent, full frame rate may produce visible artifacts including noticeable positional judder, particularly in the near field of view. An application that falls below 90fps rendering will get re-warped in time to avoid rotational judder, but while orientation latency is kept low and smooth, animation and player movement may judder in lock-step with missed frames. For these reasons we continue to recommend that developers do not rely on ATW to save them from low frame rate.
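The failsafe behavior described in that passage can be sketched in a few lines. This is a simplified illustration of the idea only, not Oculus's actual implementation; all names here are hypothetical:

```python
REFRESH_HZ = 90
FRAME_BUDGET_MS = 1000.0 / REFRESH_HZ   # ~11.1 ms per displayed frame on the Rift

def compositor_tick(new_frame, last_frame, sample_head_pose, warp):
    """One hypothetical ATW-style compositor step.

    If the application delivered a frame within budget, warp it with the
    freshest head pose. If the frame was missed (new_frame is None), re-warp
    the previous frame instead: head orientation stays responsive, but
    animation and player movement judder in lock-step with the missed frame.
    """
    pose = sample_head_pose()   # sample tracking as late as possible
    frame = new_frame if new_frame is not None else last_frame
    return warp(frame, pose)    # reproject the image for the latest orientation
```

The `else` path is the failsafe Oculus warns about: it rescues rotational latency, not the frame itself, which is why developers are told not to rely on it to cover sub-90fps rendering.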
    Finally, on a not-quite-hardware bit of news, Epic Games’ engine guru Tim Sweeney has pointed out that with the release of the Rift and the 1.3 SDK, Oculus has implemented new security and content policies on the platform. Applications compiled with the Oculus SDK but not distributed through the Oculus Store will be flagged as coming from an unknown source, and by default will be blocked. At the same time it will still be possible to run these applications, but end-users will first need to enable applications from Unknown Sources to have them unblocked.
    At first glance this appears to be an attempt by Oculus to encourage developers to adhere to their development guidelines while also building up their platform. À la the mobile app stores, Oculus will be reviewing applications and approving/rejecting submissions based on content restrictions and adherence to development best practices. Ideally this should result in a more consistent experience for users, since applications that do not follow Oculus's development guidelines will not be allowed. The content aspect of the review process is also going to be particularly important for anything that fails their content rules (and as such can never get approved), as those applications will have to go the Unknown Sources path. Meanwhile, the default blocking of applications compiled via the SDK but not sold on the store will also serve to encourage developers to use the Oculus platform and distribute their applications through the Oculus Store, rather than leaving Oculus out and using another store entirely.
    To allay some of the concerns about the Oculus Store, Oculus is allowing developers to request keys for free, which can then be sold to users via other stores. The end result is very similar to Steam, allowing Oculus platform applications to be sold at third party stores while retaining their use of the Oculus platform. However I can’t recall a parallel of a Windows peripheral developer blocking third party applications by default, as this is typically the domain of OS vendors running closed platforms.


    More...

  4. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5864

    Anandtech: The Razer Blade Stealth Review: Razer Takes On The Ultrabook

    Razer has traditionally been a company focused on gaming. In fact, their tagline is “For Gamers. By Gamers.” So when Razer announced at CES that they were building an Ultrabook – a product category whose size and power limitations are typically the antithesis of gaming – it was a bit surprising. Razer decided it was time to branch out into more of the mainstream of PC hardware, but of course with the Razer twists they are known for. The Razer Blade Stealth is not your typical Ultrabook, and one of the biggest twists of all is that it can be docked to a desktop GPU to actually enable gaming.

    More...

  5. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5865

    Anandtech: Micron Begins to Sample GDDR5X Memory, Unveils Specs of Chips

    This past week Micron quietly added its GDDR5X memory chips to its product catalogue and revealed that the DRAM devices are currently sampling to partners. The company also disclosed specifications of the chips it currently ships to partners, which will potentially be mass-produced later this summer. As it appears, the first samples, though running at much higher data rates than GDDR5, will not reach the maximum data rates initially laid out in the GDDR5X specification.
    The first GDDR5X memory chips from Micron are marked as MT58K256M32JA, feature 8 Gb (1GB) capacity, and are rated to run at 10 Gb/s, 11 Gb/s and 12 Gb/s in quad data rate (QDR) mode with 16n prefetch. The chips use 1.35 V supply and I/O voltage as well as 1.8 V pump voltage (Vpp). Micron’s GDDR5X memory devices sport 32-bit interfaces and come in 190-ball BGA packages with 14×10 mm dimensions. As reported, the GDDR5X DRAMs are manufactured using 20 nm process technology, which Micron has been using for over a year now.
    The GDDR5X memory standard, as you might remember from our previous reports, is largely based on the GDDR5 specification, but has three crucial improvements: significantly higher data-rates (up to 14 Gb/s per pin with potential up to 16 Gb/s per pin), higher and more flexible chip capacities (4 Gb, 6 Gb, 8 Gb, 12 Gb and 16 Gb capacities are supported) and better energy efficiency thanks to lower supply and I/O voltage.
    The first samples of GDDR5X memory chips fully leverage key architectural enhancements of the specification, including quad data rate (QDR) data signaling technology that doubles the amount of data transferred per cycle over the memory bus (compared to GDDR5) and allows it to use a wider 16n prefetch architecture, which enables up to 512 bit (64 Bytes) per array read or write access. However, the maximum data rates of Micron's sample chips are below those initially advertised, possibly because of a conservative approach taken by Micron and its partners.
    The addition of GDDR5X samples to Micron’s parts catalog has three important implications. First, the initial development of Micron’s GDDR5X memory chips is officially complete and the company has achieved its key goals (to increase performance of GDDR5X without increasing its power consumption). Second, one or more customers of Micron are already testing processors with GDDR5X memory controllers, which means that certain future GPUs from companies like AMD or NVIDIA do support GDDR5X and already exist in silicon. Third, the initial GDDR5X lineup from Micron will consist of moderately clocked ICs.
    GPU Memory Math
                             AMD Radeon   NVIDIA GeForce   NVIDIA GeForce   GDDR5X            GDDR5X
                             R9 290X      GTX 980 Ti       GTX 960          256-bit bus       128-bit bus
    Total Capacity           4 GB         6 GB             2 GB             8 GB              4 GB
    B/W Per Pin              5 Gb/s       7 Gb/s           7 Gb/s           12/11/10 Gb/s     12/10 Gb/s
    Chip Capacity            2 Gb         4 Gb             4 Gb             8 Gb              8 Gb
    Number of Chips          16           12               4                8                 4
    B/W Per Chip             20 GB/s      28 GB/s          28 GB/s          48/44/40 GB/s     48/40 GB/s
    Bus Width                512-bit      384-bit          128-bit          256-bit           128-bit
    Total B/W                320 GB/s     336 GB/s         112 GB/s         384/352/320 GB/s  192/160 GB/s
    Est. DRAM Power          30 W         31.5 W           10 W             20 W              10 W
    Thanks to GDDR5X memory chips with 10 Gb/s – 12 Gb/s data rates, developers of graphics cards will be able to increase the peak bandwidth of 256-bit memory sub-systems to 320 GB/s – 384 GB/s. This is an impressive achievement, because this amount of bandwidth is comparable to that of AMD’s Radeon R9 290/390 or NVIDIA’s GeForce GTX 980 Ti/Titan X graphics adapters. The latter use 512-bit and 384-bit memory interfaces, respectively, which are quite expensive and intricate to implement.
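The bandwidth figures in the table above all follow from the same simple arithmetic: per-pin data rate times bus width, divided by eight bits per byte. A quick sketch (the helper name is mine, not from any vendor SDK):

```python
# Peak memory bandwidth in GB/s = per-pin rate (Gb/s) * bus width (bits) / 8.
def total_bandwidth_gbps(per_pin_gbit: float, bus_width_bits: int) -> float:
    return per_pin_gbit * bus_width_bits / 8

print(total_bandwidth_gbps(12, 256))  # 384.0 GB/s: top GDDR5X sample on a 256-bit bus
print(total_bandwidth_gbps(10, 256))  # 320.0 GB/s: slowest sample, still matching...
print(total_bandwidth_gbps(5, 512))   # 320.0 GB/s: ...the R9 290X's 512-bit GDDR5 bus
print(total_bandwidth_gbps(7, 384))   # 336.0 GB/s: GTX 980 Ti's 384-bit GDDR5 bus
```

This is the crux of GDDR5X's appeal: a narrow 256-bit bus can reach the throughput that previously required a wide, expensive 384-bit or 512-bit interface.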

    Micron originally promised to start sampling its GDDR5X with customers in Q1, and the company has formally delivered on that promise. What now remains to be seen is when designers of GPUs plan to roll out their GDDR5X-supporting processors. Micron claims that it is set to start mass production of the new memory this summer, which hopefully means we're going to see graphics cards featuring GDDR5X before the end of the year.
    More...

  6. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5866

    Anandtech: Price Check: Prices of Unlocked Intel Core i7-6700K, i5-6600K Continue to

    When Intel first introduced its high-end Skylake-S CPUs with unlocked multipliers about half a year ago, they were not easy to get, and for a long time the Core i7-6700K as well as the Core i5-6600K were overpriced and in tight supply. Their retail prices dropped and their availability generally improved earlier this year, but the Core i7-6700K was still rather expensive in some stores. Today, both unlocked Skylake-S chips are available and their prices no longer significantly exceed their MSRPs. Nonetheless, even though the entry-level high-end desktop Core i7-5820K can now be had for roughly the i7-6700K's MSRP, the i7-6700K is still more expensive, although its price continues to drop.
    Intel Core i7-6700K Getting Closer to Its MSRP

    Intel’s most powerful desktop quad-core processor for mainstream enthusiasts, positioned below the HEDT platforms, the Core i7-6700K (four cores with Hyper-Threading, 4.0 GHz/4.20 GHz, 8 MB cache, Intel HD Graphics 530 core, unlocked multiplier) has an official MSRP of $350 according to Intel’s ARK. However, for a long time it was impossible to find the chip at that price point: it used to cost $420 in December and up to $412 in February. Moreover, even overpriced, it was still hard to find at the biggest retailers.
    The availability of the Core i7-6700K (BX80662I76700K) improved this month. All major outlets monitored by NowInStock, including Amazon, B&H Photo, NCIXUS and Newegg, had the chip in stock for $364 – $408. BestBuy lists the CPU for $409.99 and can ship it in two days, but does not provide any store pickup options, which indicates that it does not stock the product in all of its locations. Back in December, only Newegg had the Core i7-6700K in stock, so it is evident that the U.S. retailers are getting more units through distribution.
    Amazon lists the Intel Core i7-6700K for $373 and the chip is in stock. According to CamelCamelCamel, a price-tracker that monitors Amazon and its partners, the Core i7-6700K was available briefly for $349.99 (i.e., at its MSRP) earlier this month and for $363.95 earlier this week. Moreover, some of Amazon’s partners even sold the part for $340 on March 23. Last month Amazon listed the Core i7-6700K for $365, but it was not in stock. By contrast, it is possible to get it from Amazon now, albeit at a higher price point.
    Newegg offers the Core i7-6700K for $369.99 and the CPU was in stock at press time. The chip used to cost $412 last month at Newegg, so, it is evident that the retailer significantly decreased its price in the recent weeks. According to PriceZombie, which monitors Newegg, it slashed the price of the i7-6700K to around $380 early in March, reduced it to $365 earlier this week, but then hiked the price to $369.99.
    It is evident that the Core i7-6700K is still sold at a premium: its prices at different locations are higher than Intel’s MSRP for the chip. Nonetheless, it is clear that significant progress has been made, and the CPU is now both widely available and no longer costs $50+ more than it should.
    Buy Intel Core i7-6700K on Amazon.com
    Intel Core i5-6600K Drops Below MSRP

    The Intel Core i5-6600K (four cores, 3.50 GHz/3.90 GHz, 6 MB cache, Intel HD Graphics 530, unlocked multiplier) is the other overclockable Skylake processor, and it is popular among overclockers and enthusiasts due to its modern micro-architecture and near-i7 performance in most consumer situations. The processor is officially priced at $243, but it used to cost $290 in December. Moreover, only Newegg had the CPU in stock back then. By contrast, the Intel Core i5-6600K (BX80662I56600K) is currently available from all major U.S. retailers for $239.99 - $254.99, based on information from the NowInStock web-site.
    Amazon has Intel Core i5-6600K processors in stock and offers them for $239.99, down from $249.99 several weeks ago. According to CamelCamelCamel, the retailer slashed the price of the product to a level below Intel’s MSRP only this week.
    As for Newegg, it sells the Core i5-6600K for $254.99, which is higher than Intel’s official price and slightly up from February. However, if you enter the EMCEHGT62 promo code (expires on 3/30) and subscribe to Newegg’s newsletter, you can get a $15 discount and obtain the chip for $239.99. Based on data from PriceZombie, the price of the i5-6600K at Newegg has been fluctuating from $250 to $255 for several weeks now.
    The exact reason why the Core i5-6600K CPU has dipped below its MSRP is unknown, but the broad availability of the chip intensifies competition between retailers, who are willing to offer lower pricing in a bid to sell more units.
    Buy Intel Core i5-6600K on Amazon.com
    Intel Core i7-5820K Now Available for $349

    The Intel Core i7-6700K processor based on the Skylake micro-architecture has a number of advantages over previous-generation chips, but the entry-level high-end desktop processor, the Core i7-5820K CPU (six cores with Hyper-Threading, 3.30 GHz/3.60 GHz, 15 MB cache, unlocked multiplier), still remains a very interesting product from many points of view. Moreover, it is still cheaper than the quad-core Core i7-6700K despite having more cores.
    The baby HEDT chip is now available for $349.99 from Amazon and Newegg, which is well below its official price of $396. The Core i7-5820K used to cost over $380 in February, so, it is evident that its price dropped substantially within several weeks. Keeping in mind that Intel is expected to update its high-end desktop family of processors with Broadwell-E offerings later this year, it is not really surprising that Haswell-E products are getting more affordable.
    Of course, to use the Core i7-5820K (BX80648I75820K) you will need an X99-based motherboard (which, on average, costs more than Skylake-focused 100-series motherboards), an advanced cooler, and four DDR4 memory modules to maximize the available memory bandwidth. However, for $349 you get six Hyper-Threaded cores that can be easily overclocked and a capable motherboard (most new X99 boards have USB 3.1 Type-C, though M.2 and Thunderbolt 3 options vary). Moreover, the Intel X99 platforms will eventually support Broadwell-E processors later this year.
    Buy Intel Core i7-5820K on Amazon.com
    Intel’s 14 nm Supply Finally Meets Demand?

    Intel has confirmed multiple times in recent quarters that demand for its Core i7 high-end CPUs as well as its K-series chips with unlocked multipliers was at record highs. The unprecedented demand, as well as the initially slow ramp of Skylake-S processors, created shortages of unlocked Core i5/Core i7 chips, leading to the high prices of such CPUs in late 2015.
    Starting in late Q3 2015, Intel has been gradually increasing production of its CPUs using its 14 nm process technology. The company began to produce 14 nm processors at its Fab 24 manufacturing facility in Leixlip, Ireland in Q3 FY2015, in addition to its D1D, D1C and D1X fabs in Hillsboro, Oregon. Yields of the CPUs have most likely improved since their introduction in August, and more wafers are being turned into desktop parts, leading to larger shipments of higher-end Skylake-S products. As a result, prices of the Core i5-6600K and the Core i7-6700K are decreasing.
    Nonetheless, there could be another reason why the prices of Intel’s CPUs are getting lower. Demand for PCs in the first quarter is traditionally below that in the fourth quarter. However, this year sales of PC components in Q1 have been slower than expected, according to market analysts. As a consequence, Intel (as well as Intel's distribution network) has more processors than it can sell, according to Stacy Rasgon, an analyst with Bernstein Research, citing Intel's reports.
    “First, inventories have ballooned over the last year. Internal inventories in Q4 were up >20% YoY on tepid revenue growth and have been outgrowing revenues for multiple quarters,” Mr. Rasgon wrote in a note for clients, reports Tech Trader Daily. “Second, receivables have once again spiked. Absolute receivable growth has far exceeded revenue growth over the last year, and days of receivables spiked in Q4, for only the second time in 10 years.”
    Moreover, last week analysts John Donovan, Paul Peterson, and Steve Mullane with BlueFin Research Partners reported that, based on their checks, Intel had cut production of certain CPUs in February and March compared to January. Since no details are known, it is possible that Intel was winding down production of older products due to weak demand. However, since Intel’s inventories and receivables are at high levels, the company may simply not need to produce many processors because the market first needs to consume material that has already been produced. Given that the Core i5-6600K and the Core i7-6700K are Intel’s high-end 'mainstream' desktop processors today, and given their relatively high prices, it is unlikely that their shipments substantially exceed demand.
    Relevant Reading

    Skylake-K Review: Core i7-6700K and Core i5-6600K - CPU Review
    Comparison between the i7-6700K and i7-2600K in Bench - CPU Comparison
    Overclocking Performance Mini-Test to 4.8 GHz - Overclocking
    Skylake Architecture Analysis - Architecture
    Non-K BCLK Overclocking is Being Removed - Overclocking Update
    An Overclockable Core i3: The Intel Core i3-6100TE Review - Analysis of Overclocked Core i3 CPU


    More...

  7. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5867

    Anandtech: Toshiba Details Its PC Business Reorg: Set to Concentrate on Tecra and Por

    Toshiba this month revealed its finalized PC business strategy for the future. As expected, the company intends to cease selling consumer personal computers outside of Japan and plans to focus on corporate and business PCs going forward. Toshiba will sell existing consumer PCs in North America and Europe and will honor the warranties in the future. However, the company has no plans to introduce any new consumer models outside of its home market.
    Toshiba disclosed plans to reorganize its PC business in September 2014. The company announced intentions to stop selling consumer computers completely and focus on business and corporate PCs instead. Toshiba said that the purpose of the reorganization was to ensure the profitability of this business unit and improve its competitive position against companies like Dell, HP or Lenovo on the corporate PC market. Toshiba hopes that the new focus will help it significantly increase its B2B (business-to-business) sales as soon as fiscal 2016 (which begins on April 1, 2016) and become profitable.
    For a number of years Toshiba’s PC business was focused on increasing market share, which meant that the company had to develop two separate product families: one for business users and another for consumers. Due to tough competition, it is not easy to sell consumer PCs nowadays. Product families have to be broad, profit margins are razor thin and suppliers have to focus primarily on sales scale and volume. While Toshiba is known for affordable systems in the U.S., that business was not profitable for the company. This was one of the reasons why Toshiba decided to cease selling its consumer PCs outside of Japan.
    As part of the reorganization, the company has reduced the headcount of its PC business by 1,300 people and eliminated multiple operation sites. Toshiba plans to offer a full range of corporate personal computers, tablets and workstations. In particular, higher-performance notebooks will be sold under the Tecra brand, ultra-thin laptops under the Portégé trademark, whereas tablets and 2-in-1s will carry the dynaBook and Portégé names.
    “Toshiba will concentrate on the B2B PC market globally by developing, manufacturing, and selling its Tecra and Portégé brands to the corporate market,” the company said in its statement.
    Right now Toshiba’s retail partners offer a variety of Satellite notebooks and other low-cost consumer PCs, including models based on Intel processors featuring the Broadwell micro-architecture. These systems will be available while stocks last, and then customers interested in Toshiba PCs will have to buy Tecra, dynaBook and Portégé systems either directly from Toshiba or from various resellers. In short, Toshiba-branded PCs are not going away from the U.S., but they will not be as widely available and will cost more than they do today. The company will honor all Satellite and other warranties.
    “Toshiba will continue selling its consumer notebooks through its retail partners as the company expands its corporate footprint,” the company said. “Customers can purchase Toshiba with confidence knowing their product warranties and service obligations will be honored.”
    To better address the PC market both in Japan and in other countries, Toshiba will establish Toshiba Client Solutions Co. later this week. Moreover, the company will continue to discuss further reforms of its PC business with third parties. There are rumors that Toshiba is negotiating strategic deals with other Japanese computer suppliers and investors, but so far nothing official has been revealed.


    More...

  8. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5868

    Anandtech: Microsoft Build 2016 Keynote Live Blog


  9. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5869

    Anandtech: Apple Announces the Safari Technology Preview

    Today Apple made an interesting announcement for developers regarding Safari. Safari is Apple's browser across all of their iOS and OS X devices, and the layout engine at its core is WebKit. WebKit was originally started as a project within Apple as a fork of KHTML, a layout engine developed by the KDE project. Today it's widely employed in many browsers on many platforms, with Google and Opera formerly using it and now utilizing a fork of WebKit called Blink.
    Traditionally developers who want to use the latest improvements to WebKit have had to download WebKit nightly builds, which as their name implies, are builds that reflect the latest changes to the WebKit code base and are released on 24 hour intervals. This allows developers to test and develop against new features being added to WebKit, which later make their way to Safari and other WebKit-based browsers as well. Distributing nightly builds for developers is a common practice for large software projects, but in the case of Safari and WebKit it was not ideal in many ways. For one, it essentially made the version of Safari on a computer use the new nightly WebKit back end that was installed, which can lead to annoyances when trying to compare between the existing public release and the nightly build. It also disables features like iCloud integration for tabs, bookmarks, passwords, etc, as the builds are not signed by Apple.
    The Safari Technology Preview is Apple's attempt to address some of these problems, and to make it easier for developers to keep track of what changes are being made and to submit feedback or bug reports based on what they experience. The preview is an application separate from Safari that uses a more up-to-date version of WebKit than the public version of Safari that ships with OS X. It's available from Apple's developer website, and updates will come every two weeks via the Mac App Store. This makes the list of changes and additions easily accessible with each update, and because the builds are signed by Apple there's full support for iCloud integration. Having a separate application also means that comparisons and regression testing between the current official version of Safari and one with a more up-to-date version of WebKit can be done easily.
    One important thing to note about the Safari Technology Preview is that, while the app is available from Apple's developer site, you don't need to be a registered developer paying the yearly iOS and OS X publishing fee to access it. Since the target audience consists mainly of programmers building websites and web applications, it doesn't make sense to limit it to developers building native apps for iOS and OS X.
    Apple is highlighting some key things that are new in the initial release of the Safari Technology Preview. The first is that it has what they claim to be one of the most complete implementations of ECMAScript 6 (ES6), which in less precise but simpler terms means the latest version of JavaScript, as JavaScript was standardized as ECMAScript and now can be considered an implementation of the standard itself.

    Image source: Mozilla
    ES6 comes with some key features for developers, including support for classes as part of the object oriented paradigm, iterators, and many new APIs. I am personally not a web developer, and the fact that JavaScript is just now adopting more explicit class declarations on top of the existing function prototype based declarations comes as quite a surprise to me.
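    For readers unfamiliar with the change, here is a minimal side-by-side of the two styles (the names are invented for the example, and it is shown in TypeScript for the type annotations; the class syntax itself is plain ES6):

    ```typescript
    // Pre-ES6 style: behavior hangs off a constructor function's prototype.
    function OldPoint(this: { x: number; y: number }, x: number, y: number) {
      this.x = x;
      this.y = y;
    }
    OldPoint.prototype.norm = function (this: { x: number; y: number }): number {
      return Math.hypot(this.x, this.y);
    };
    const q = new (OldPoint as any)(8, 6);
    console.log(q.norm()); // 10

    // ES6 style: the same object expressed with an explicit class declaration.
    class Point {
      constructor(public x: number, public y: number) {}
      norm(): number {
        return Math.hypot(this.x, this.y);
      }
    }
    console.log(new Point(3, 4).norm()); // 5
    ```

    Both forms produce objects with the same behavior; the class declaration simply makes the constructor-plus-prototype pattern explicit and readable.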
    Another key feature of the Safari Technology Preview is the new B3 Just-In-Time (JIT) JavaScript compiler. B3 is the new compiler backend for WebKit's FTL JIT compiler which was introduced about a year ago with LLVM acting as the backend. At that time there was a great deal of info about the work that went into making LLVM, traditionally a production grade compiler for native applications, usable for compiling JavaScript on the fly within the constraints of something like a smartphone. Since Apple has been a major part of both WebKit and LLVM, using LLVM as the backend to achieve greater optimization of JavaScript code made sense. However, LLVM was architected as a compiler that would be used for optimizing and compiling code on large powerful desktop computers where power usage and compile times were not a large concern, as the code would simply be compiled and shipped to be run. In the context of a mobile device, you'll be visiting various sites and compiling a great deal of different JavaScript code, and so a different strategy needs to be employed.
    This is where B3 comes in. According to Apple, LLVM's optimizations often are overkill for the task of JavaScript compilation. There are cases where it's actually faster to just compile some lines of code and run them than to take time to optimize, compile, and then run. In these situations, there are performance gains to be made by moving away from LLVM, as you need to work on minimizing compile time rather than generating the most efficient code possible. On a high level, B3 looks at the JavaScript code that needs to be executed and decides whether it's actually worth optimizing it or not. For complex code that may be run many times it makes sense to spend the extra time optimizing, but for small groups of simple statements it may be better to just compile it without optimizations.
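    WebKit's actual heuristics are internal and more involved than this, but purely as an illustration of the trade-off described above (the threshold and names here are invented), a tiered JIT's decision boils down to something like:

    ```typescript
    type Tier = "fast-compile" | "optimize";

    // Invented threshold: roughly how many executions before the cost of an
    // optimizing compile is paid back by faster generated code.
    const HOT_THRESHOLD = 1000;

    // Cold code gets a quick, unoptimized compile: for a handful of runs,
    // optimization time would exceed the execution time it saves.
    // Hot code gets the expensive optimizing compile, amortized over many runs.
    function chooseTier(executionCount: number): Tier {
      return executionCount < HOT_THRESHOLD ? "fast-compile" : "optimize";
    }

    console.log(chooseTier(3));     // "fast-compile"
    console.log(chooseTier(50000)); // "optimize"
    ```

    The point of B3, per Apple, is to make both branches cheap: the fast path compiles quicker than LLVM did, while the optimizing path avoids LLVM passes that rarely help JavaScript.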
    Of course, a big question may be why you wouldn't just apply tweaks to LLVM. Apple says that B3 was designed from the ground up with a focus on quick JavaScript execution, but it wouldn't be surprising if it retains ties to LLVM, which already exists and provides a solid foundation.
    For developers and other interested parties looking for more info on B3 there's a post on the official WebKit blog about other improvements that have been made. They also highlight some improvements in compile time that have been observed with popular benchmarks, while also demonstrating the fact that performance doesn't regress from the LLVM backend despite the significant reductions in compile time. Right now B3 isn't fully ported to ARM64 and that will be necessary before we see it debuting on iOS.
    The last two major inclusions in the first release of the Safari Technology Preview are an updated IndexedDB implementation and support for the newest standard of Shadow DOM. The former is a way of storing data on a client device for quick access, and the changes are the result of developer feedback, with Apple claiming that the new implementation is more stable and more compliant with established standards. A simple explanation of Shadow DOM is that it provides a way for developers making websites and web apps to better define the style of widgets and controls while keeping them independent from other styling options that apply to the page.
    The Safari Technology Preview is available now from Apple's developer website. As mentioned before, it only needs to be installed from the developer site once, with future updates arriving every two weeks via the Mac App Store, bringing the latest changes to WebKit and Safari in tow.


    More...

  10. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5870

    Anandtech: Windows Adds Support For Bash Scripting

    With the goal of making more developers use Windows, and to help them move their workflows over to Windows, Microsoft has taken the step to enable Bash scripting natively in Windows 10. This will be a new subsystem, and not an emulation layer, with full access to native Windows functions such as the file system and APIs.
    Microsoft partnered with Canonical to bring an Ubuntu-based subsystem to Windows. In the keynote, Microsoft spoke about the feedback it has received regarding its Hosted Web App bridge, which lets developers take web apps and distribute them through the Windows Store as pseudo-native apps. These Web Apps can have access to Universal Windows Platform (UWP) APIs for things like Live Tiles and Cortana integration, but without a lot of the overhead of rewriting them as native apps. But the feedback was that many of the development tools these developers use require Bash scripting, making it difficult to do the development on Windows and hindering Web App adoption.
    Adding the Ubuntu subsystem to Windows is an interesting solution to this problem. Linux does a lot of things much differently than Windows, including having a case-sensitive file system, among other things, so certainly some work has been done on the back end to enable this in Windows.
    This announcement, like many of Microsoft’s over the last year or more, is about making it easier for devs to work on Windows, and about expanding the install base of targeted applications with bridges and Xamarin.
    I hope to have some more info on the Bash announcement in the next couple of days.


    More...
