
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5881

    Anandtech: Intel Rolls Out New PCIe SSDs for Cloud Datacenters

Intel on Thursday introduced several new PCIe SSDs designed for cloud datacenters. The new drives increase capacities, shrink latencies and offer higher throughput to keep up with the demands of the datacenters being deployed today. Some of the new SSDs are based on Intel’s 3D NAND memory, whereas others feature NVMe 1.2 technology and dual-port U.2 capability to increase the performance of mission-critical data-storage applications.
    Intel hopes that in the future SSDs will be used not only to store hot, frequently used data, but also data that is currently stored on highly-reliable high-performance hard drives. To replace 10K and 15K RPM HDDs in the datacenter, Intel needs to offer improved reliability, high endurance, unbeatable performance, lower costs and additional features impossible on HDDs. The new SSDs from the company bring a number of improvements to the datacenter compared to previous-generation drives.
The new Intel SSD DC P3320 and P3520 families of SSDs are based on 32-layer 3D NAND flash memory developed and produced by Intel and Micron. The drives are powered by unspecified controllers, and support end-to-end data protection as well as some other functions important in datacenters. Intel claims that its 3D NAND has better endurance than planar NAND flash memory, which is common knowledge at this point, but does not provide any exact numbers for its NAND. The DC P3320 comes in 2.5” or HHHL card form factors and uses either PCIe 3.0 x4 or U.2 to connect to the host. The DC P3320 is being pitched as a step up from Intel's SATA-based datacenter SSDs, while the DC P3520 will presumably be replacing the DC P3500 series.
    Specifications of Intel DC P3320 SSDs
    Capacity | 450 GB | 1.2 TB | 2 TB
    Form Factor | 2.5" drive | 2.5" drive or HHHL card | 2.5" drive or HHHL card
    Controller | unknown | unknown | unknown
    Interface | U.2 | U.2 or PCIe 3.0 x4 | U.2 or PCIe 3.0 x4
    Protocol | NVMe | NVMe | NVMe
    DRAM | unknown | unknown | unknown
    NAND | 256 Gb MLC 32-layer 3D NAND | 256 Gb MLC 32-layer 3D NAND | 256 Gb MLC 32-layer 3D NAND
    Sequential Read | 1100 MB/s | 1600 MB/s | 1600 MB/s
    Sequential Write | 500 MB/s | 1000 MB/s | 1400 MB/s
    4KB Random Read (QD32) | 130K IOPS | 275K IOPS | 365K IOPS
    4KB Random Write (QD32) | 17K IOPS | 22K IOPS | 22K IOPS
    Launch Date | Q1 2016 | Q1 2016 | Q1 2016
    The DC P3320 SSDs are offered in 450 GB, 1.2 TB and 2 TB capacities and are designed for read-intensive applications. The new drives are rated for maximum sequential read/write speeds of up to 1600/1400 MB/s, and the maximum random 4K read/write performance declared by Intel for the DC P3320 is 365K/22K IOPS (input/output operations per second). Intel does not reveal any details about the DC P3520, but claims that these drives were designed to deliver “significant” performance and latency improvements over the DC P3320 (which probably indicates higher parallelism and higher capacities, which Intel does not want to talk about at the moment).
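    For a rough sense of what those ratings mean in practice, the 4KB random IOPS figures can be converted into equivalent throughput. The sketch below (plain Python, using the 2 TB model's numbers from the table above) is only a back-of-the-envelope check: random reads at QD32 already approach the drive's sequential ceiling, while random writes sit far below it, which fits the read-intensive positioning.

```python
# Back-of-the-envelope conversion of the 2 TB DC P3320's 4KB random
# IOPS ratings into equivalent throughput (figures from the table above).
BLOCK_BYTES = 4096           # 4KB random transfers
MB = 1000 * 1000             # SSD vendors quote decimal megabytes

random_read_iops = 365_000
random_write_iops = 22_000

read_mb_s = random_read_iops * BLOCK_BYTES / MB    # ~1495 MB/s
write_mb_s = random_write_iops * BLOCK_BYTES / MB  # ~90 MB/s

print(f"4KB random read  ~ {read_mb_s:.0f} MB/s (sequential ceiling: 1600 MB/s)")
print(f"4KB random write ~ {write_mb_s:.0f} MB/s (sequential ceiling: 1400 MB/s)")
```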
    Meanwhile, the new Intel DC D3600/D3700 SSDs (not to be confused with the P or S series) are designed for mission-critical storage applications that must function 24/7, which is why they utilize proven MLC NAND flash memory with high-endurance technology (HET) as well as controllers that support NVMe 1.2 with various high-availability features and up to 80 I/O queues. Intel does not disclose which controller it uses, but claims that the drives feature an integrated memory buffer and dynamic multiple namespace management technology to improve the efficiency of data management across drives in one machine, an exclusive feature (which potentially means that Intel uses a custom controller for these SSDs).
    The drives sport an active/active dual-port design that connects through a compatible backplane to two host systems simultaneously (which enables run-time recovery during failover when one of the hosts is unavailable) and support hot-plugging. The DC D3600/D3700 drives also feature end-to-end data protection, power-loss data protection with self-test, and thermal throttling and monitoring to ensure maximum reliability. The SSDs utilize a PCIe 3.0 x4 interface and U.2 connectors; since each drive has only a single U.2 connector, dual-port mode relies on the backplane routing two PCIe lanes to each of the two host systems.
    Specifications of Intel DC D3600 and D3700 SSDs
    Model / Capacity | DC D3700 800 GB | DC D3600 1 TB | DC D3700 1.6 TB | DC D3600 2 TB
    Form Factor | 2.5" drive with U.2 interface (all models)
    Controller | unknown
    Interface | U.2/PCIe 3.0 x4
    Protocol | NVMe 1.2
    DRAM | unknown
    NAND | MLC NAND with HET (high-endurance technology)
    Sequential Read | 1900 MB/s | 1800 MB/s | 2100 MB/s | 2100 MB/s
    Sequential Write | 970 MB/s | 940 MB/s | 1500 MB/s | 1500 MB/s
    4KB Random Read (QD32) | 450K IOPS | 450K IOPS | 470K IOPS | 470K IOPS
    4KB Random Write (QD32) | 65K IOPS | 25K IOPS | 95K IOPS | 30K IOPS
    Launch Date | Q1 2016
    Intel’s DC D3600/D3700 solid-state drives will be available in 800 GB and 1.6 TB (D3700) as well as 1 TB and 2 TB (D3600) configurations. According to Intel, the new SSDs deliver sequential read speeds of up to 2100 MB/s and sequential write performance of up to 1500 MB/s. The new SSDs can also perform up to 470K random read IOPS (4KB) and up to 95K random write IOPS (4KB).
    Since SSDs with a U.2 interface are not compatible with existing SAS or SATA backplanes (which do not support PCIe), they need support from makers of datacenter storage solutions. Intel claims that companies like EMC, Huawei, Quanta, Wistron and X-IO Technologies are ready to build a mission-critical storage ecosystem around PCIe-based SSDs, but does not provide further details.
    A hands-on look at a sample of the DC D3700 revealed that Intel has changed the design of the heatsink on the bottom of the drive to allow for airflow in two directions across the back half of the drive where the controller most likely resides.
    Intel did not touch upon price and availability details for its new SSDs, but we expect them to arrive later this year. Keeping in mind that there are not many mission-critical PCIe backplanes in the wild at the moment, it will take some time before Intel’s DC D3600/D3700 become widespread.


    More...

  2. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5882

    Anandtech: The OCZ Trion 150 SSD Review

    The OCZ Trion 150 is the latest entry-level SSD from Toshiba's subsidiary. It makes almost no hardware changes from the Trion 100 aside from using Toshiba's newer 15nm TLC NAND, but it ends up being a very different drive, especially given current market conditions.

    More...

  3. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5883

    Anandtech: Lenovo Upgrades 2-Way ThinkStation Workstations with Intel Xeon E5 v4 CPUs

    Lenovo this week upgraded its range of 2-way workstations with Intel’s new Xeon E5 v4 microprocessors as well as NVIDIA’s recently announced Quadro M6000 24 GB graphics card. The new ThinkStation P910 and P710 machines rely on the chassis and design of their predecessors and have a lot in common with the machines introduced in 2014. However, the new systems bring a number of improvements in addition to more powerful CPUs and GPUs.
    Read our review of the design of the Thinkstation P900 (E5 v3 version) here: The Lenovo ThinkStation P900 Workstation Review: Design 101
    The new Lenovo ThinkStation P710 and P910 workstations are based on the Intel C612 platform controller hub (PCH) and are compatible with various Intel Xeon processors in the LGA2011-3 package, including the latest Xeon E5 v4 (also known as Broadwell-EP). Both systems support up to two Xeon processors, which gives their owners up to 44 physical cores in total (or up to 88 threads with Hyper-Threading enabled). The P710 and P910 machines can also house up to three NVIDIA Quadro or NVIDIA Tesla graphics or compute cards to handle professional graphics programs and simulation applications (while the P710 formally supports three PCIe x16 slots (x16/x16/x8), Lenovo itself does not install more than one M6000 card in the system). To better cool hot components (i.e., CPUs and GPUs), Lenovo’s P-series workstations feature a special “tri-channel cooling” technology, which can quickly increase airflow in one of the “channels” to direct cool air at particular pieces of hardware rather than at all components in the chassis, thus avoiding excessive noise.
    The full-size Lenovo ThinkStation P910 features 16 DDR4 DIMM slots and can accommodate up to 1 TB of RAM (thanks to the new Xeon CPUs, DDR4-2400 modules are now supported). The system can also be equipped with up to 14 internal storage devices (RAID 0, 1, 5 and 10 are supported), including four M.2 SSDs (which are installed into special Flex adapters that are then plugged into PCIe x8 Flex connectors) and up to ten 2.5” and 3.5” HDDs, in order to provide both speed and capacity. The ThinkStation P910 also has a Flex Bay, which is designed to hold an optical drive and a 9-in-1 media card reader, or the Flex module, which holds a 9.5 mm ODD, a 29-in-1 media card reader, and eSATA and FireWire connectors. To support power-hungry processors, graphics cards, massive amounts of RAM and storage as well as other devices, the P910 comes with a 1300 W PSU that should deliver the right amount of power even to the most demanding components and configurations.
    The ThinkStation P910 should be very similar to the ThinkStation P900, but with a couple of important additions such as Thunderbolt 3 and 802.11ac Wi-Fi. While Intel's Haswell-EP and Broadwell-EP processors use the same packaging and Intel has said that the new chips should be compatible with LGA2011-3 sockets, Lenovo itself does not offer its ThinkStation P900 with the new CPUs. The Xeon E5 v4 will be installed exclusively into the P910 and P710, and it is unclear whether the PC maker plans to release new BIOS revisions for its previous-generation workstations to enable upgrades.
    The Lenovo ThinkStation P710 does not have the vast expansion capabilities of the P910 because it is generally more compact; nonetheless, it is still a very robust 2P workstation. The P710 sports 12 DDR4 DIMM slots for up to 384 GB of RAM, can be equipped with up to 12 storage devices (two M.2 SSDs [because the system has only one Flex connector] and up to ten 2.5”/3.5” HDDs) and offers a Flex module. Since the P710 is positioned as a compact 2P machine, it can be equipped with a 490 W, 650 W or 850 W PSU in a bid to balance performance and noise levels.
    Both new ThinkStations from Lenovo support all the features and technologies of their predecessors, such as tool-less upgrades (even the PSU can be replaced without tools), a USB diagnostics port and so on (these features are described in our P900 review). What is different is that the P710 and P910 workstations can also be optionally equipped with an Intel 802.11ac card (2x2, 2.4 GHz/5 GHz + Bluetooth 4.0) as well as Intel’s 'Alpine Ridge' Thunderbolt 3/USB 3.1 controller (which will support one USB Type-C port). The addition of TB3 and a wireless module suggests that we are dealing with updated motherboards inside the new workstations, but Lenovo has yet to confirm this.
    Lenovo’s ThinkStation P710 and P910 workstations should hit the market in the coming weeks, after the company finishes certification with key ISVs. Prices of the new systems will start at $1600 – $1800, but configurations with two CPUs, professional graphics cards, a lot of RAM and multiple storage devices will easily cost $15,000 and higher.
    Gallery: Lenovo Upgrades 2-Way ThinkStation Workstations with Intel Xeon E5 v4 CPUs




    More...

  4. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5884

    Anandtech: The Cougar Attack X3 Mechanical Keyboard & 450M Gaming Mouse Review

    Today we're taking a look at the Attack X3 mechanical gaming keyboard and the 450M optical gaming mouse from Cougar. The Attack X3 is an advanced, programmable mechanical keyboard without many extra functions or RGB lighting, while the 450M is a relatively simple programmable mouse with a high-precision optical sensor. Both products are aimed at the center of the gaming market, targeting those who want advanced, high-quality products but are also mindful of their budgets.
    Of all the peripherals we look at, those meant for the "average gamer" are perhaps the most interesting. Unlike flagship products, where pricing is rarely a concern and virtually every feature can be (and is) added, for products aimed at the mass market manufacturers need to balance functionality and design with the needs of the market, and ultimately with cost concerns. So compared to Cougar's outstanding 700 series products, it will be interesting to see if Cougar can maintain this level of quality with the Attack X3 and 450M. If so, then they should have a winner on their hands.



    More...

  5. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5885

    Anandtech: NVIDIA Announces Quadro M5500 For Notebooks, Details Professional VR Plans

    Taking place this week is NVIDIA’s annual developer conference, the GPU Technology Conference. A show for everything NVIDIA across all of the company’s major divisions, this year’s show stands to be especially important to NVIDIA. With their Pascal architecture set to launch this year, I expect we’ll be hearing more about it, and meanwhile tomorrow’s keynote is on the same day as the HTC Vive launch, underscoring the importance of VR.
    In fact it’s VR where we’ll start today, as NVIDIA’s professional visualization group is releasing some news a day ahead of tomorrow’s formal kickoff. NVIDIA is pitching products and technologies for both consumer and professional use, and today they are detailing their professional VR plans for the show. To kick things off, they are announcing a new notebook Quadro video card, the Quadro M5500.
    NVIDIA Quadro Mobile Specification Comparison
     | Quadro M5500 | Quadro M5000M | Quadro M4000M | Quadro K5100M
    CUDA Cores | 2048 | 1536 | 1280 | 1536
    Boost Clock | 1140MHz | ~1050MHz | N/A | ~800MHz
    Memory Clock | 6.6Gbps GDDR5 | 5Gbps GDDR5 | 5Gbps GDDR5 | 3.6Gbps GDDR5
    Memory Bus Width | 256-bit | 256-bit | 256-bit | 256-bit
    VRAM | 8GB | 8GB | 4GB | 8GB
    FP64 | 1/32 | 1/32 | 1/32 | 1/24
    TDP | 150W | 100W | 100W | 100W
    GPU | GM204 | GM204 | GM204 | GK104
    Architecture | Maxwell 2 | Maxwell 2 | Maxwell 2 | Kepler
    VR Ready | Yes | No | No | No
    Much like its consumer counterpart, the GeForce GTX 980 for Notebooks, the Quadro M5500 is a new SKU for high-end desktop replacement style laptops that incorporates a fully enabled GM204, and an even higher TDP to support it. The significance of this new Quadro module – beyond now being the fastest mobile Quadro – is that it’s the only mobile Quadro fast enough to meet HTC and Oculus’s hardware recommendations, as the M5000M falls short. As a result this launch enables NVIDIA and their partners to sell VR-ready professional laptops alongside their existing consumer laptops.
    As far as technical specifications go, we are looking at a fully enabled GM204 GPU with a boost clock of roughly 1140MHz. The GPU is paired with 8GB of GDDR5, clocked at 6.6Gbps. Altogether this should be a major step up from the Quadro M5000M, which shipped with one-quarter of its GM204 GPU disabled. In terms of performance this means the new M5500 should offer a better than 40% performance boost in shader or texture-bound scenarios, along with 32% more memory bandwidth.
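    Those figures line up with simple scaling math. The sketch below (plain Python, numbers from the table above) is only a rough cross-check under the assumption that shader/texture throughput scales with CUDA cores times clock; it is not a performance model.

```python
# Rough cross-check of the claimed Quadro M5500 vs. M5000M gains,
# assuming shader/texture throughput scales with CUDA cores x clock.
m5500_cores, m5500_clock_mhz, m5500_mem_gbps = 2048, 1140, 6.6
m5000m_cores, m5000m_clock_mhz, m5000m_mem_gbps = 1536, 1050, 5.0

shader_gain = (m5500_cores * m5500_clock_mhz) / (m5000m_cores * m5000m_clock_mhz) - 1
bandwidth_gain = m5500_mem_gbps / m5000m_mem_gbps - 1

print(f"Shader/texture throughput: +{shader_gain:.0%}")    # ~ +45%, i.e. "better than 40%"
print(f"Memory bandwidth:          +{bandwidth_gain:.0%}")  # +32%
```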
    The trade-off, of course, is that as performance approaches desktop levels, so do the power requirements. The M5500 has a TDP specification of 150W, a full 50% higher than the 100W TDP of the M5000M. Consequently this is a video card for true desktop replacement style laptops, as it takes a very large cooling system to handle so much heat.
    The first laptop to ship with the M5500 will in turn be MSI’s WT72 mobile workstation, a 17.3”, 8.4 lb laptop paired with an Intel mobile Xeon CPU. Meanwhile, though MSI’s specifications don’t specifically make note of it, I expect that the Quadro M5500 version of the laptop will not feature NVIDIA’s Optimus technology, as the card needs to be directly wired to the HDMI port in order to support VR headsets.
    Overall the Quadro M5500 will be the backbone of NVIDIA’s mobile VR efforts for this generation, as it stands alone in NVIDIA’s VR-capable mobile Quadro lineup. The company will be shipping the video card in May, and while a price has not been announced, expect it to be priced like a high-end professional mobile GPU.
    Doubling Down On VR: Quadro VR Ready & More

    Along with the launch of the Quadro M5500, NVIDIA is also detailing their overall plans for VR support in the professional market. As with the consumer side of the company, the professional side sees VR as being a major engine of growth for the GPU business, as VR headsets require a very high minimum level of rendering performance. At the same time the company is looking to entice VR software developers with their VR toolsets.
    On the hardware side of matters, the company and its partners will be extending its VR Ready program into the professional desktop market. Dell, Lenovo, HP, and other vendors will be shipping VR Ready certified workstations in the coming months. The minimum required here is the Quadro M5000 – NVIDIA’s second-tier desktop Quadro card featuring a fully enabled GM204 GPU – while various Quadro M6000 configurations will offer additional power. For those applications that support VR SLI, at the top of the stack is the recently launched 24GB M6000 in SLI.
    Along with the immediate performance requirements for driving a VR headset, NVIDIA is hoping to sell developers on the utility of additional performance when working on VR development, as content creation by its very nature lacks the rigid performance optimizations and boundaries of consumer content. Arguably the Holy Grail here is being able to do VR development in VR, and while current video card performance doesn’t quite allow that at full scale, at least parts of the process can be done in VR today. At the same time it doesn’t look like we’ll see an NVIDIA answer to the recently announced AMD Radeon Pro Duo; however, given NVIDIA’s dominant position in the professional GPU market, that’s not all that surprising.
    For professional VR, NVIDIA is also reminding developers that VR Works is available for professional applications as well as consumer applications. There are not any professional-specific VR technologies in VR Works, so the functionality is otherwise identical to what we’ve seen before, with application developers getting access to VR SLI, various shading techniques including multi-res shading, and support for asynchronous timewarp as implemented in the latest VR headset SDKs. As with consumer applications, for a lot of developers this will be more about what applications support these technologies, as they use licensed engines and applications to develop their final products.
    Finally, as far as professional VR use cases go, we should see quite a bit of it on display at the show this week. NVIDIA has a whole section of the show floor dedicated to VR – what they’re calling VR Village – where developers will be showing off how they’re using VR. Common themes this year will include collaboration, medical, and prototyping/development (think VR CAD). Though VR as a whole has primarily been focused on consumer uses – and there will be plenty of that at the show as well – NVIDIA seem keen on showing professional users that VR can be used to augment what they do as well.

    More...

  6. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5886

    Anandtech: AMD Releases Radeon Software Crimson Edition 16.4.1 Hotfix - Game Ready Su

    Another week, another driver release; at least that feels like the way of things now. While some of AMD’s driver releases lately have been light on features and fixes, as a developer by day I can applaud the steady march of progress.
    The Radeon Software Crimson Edition 16.4.1 Hotfix brings the driver version to 16.15.2211.1001. The list of fixes is short: this release resolves flickering that may occur while playing Hitman in DirectX 11 with high in-game shadow settings. It also appears that the DirectX 12 frame rate capping issue may still remain in some games, as the release notes state that it exists in some applications and has been resolved. A similar issue locking DirectX 12 applications to the refresh rate of the display was mentioned and resolved last month, but it is unclear whether this is a separate issue or more of the same.
    Alongside continued support for both the Oculus Rift and HTC Vive, this release adds game ready driver support for the brand new title Quantum Break, with AMD claiming up to 35% faster performance on a Radeon R9 Fury X when compared against Radeon Software Crimson Edition 16.3.2. AMD also makes clear in the footnotes of this driver's release notes that this was tested on a Windows 10 system running an Intel i7-5960X with 16GB of RAM at a resolution of 3840x2160. While gains of 35% are impressive, the scores AMD shares for Radeon Software Crimson Edition 16.3.2 and 16.4.1 are 16.693 and 22.663 respectively. If these are abstract scores, I don't know how they are derived and can't speak to them; if they are frame rates, then at that level of performance the margin of error can swing wide, and it doesn't represent a real use case either, since nobody is going to settle for framerates that low on an R9 Fury X. Despite this I must make clear that I appreciate the disclosure; having this information gives the performance numbers provided by AMD much more meaning and validity than they would have had without it.
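    For what it's worth, the quoted scores roughly support the headline claim; a quick check (plain Python, using the figures from AMD's footnote) is below.

```python
# Quick check of AMD's "up to 35%" Quantum Break claim against the
# scores quoted in the footnotes of the 16.4.1 release notes.
score_16_3_2 = 16.693   # Radeon Software Crimson Edition 16.3.2
score_16_4_1 = 22.663   # Radeon Software Crimson Edition 16.4.1

gain = score_16_4_1 / score_16_3_2 - 1
print(f"Measured improvement: +{gain:.1%}")   # ~ +35.8%
```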
    As always, those interested in reading more or installing the updated hotfix drivers for AMD’s desktop, mobile, and integrated GPUs can find them either under the driver update section in Radeon Settings or on AMD’s Radeon Software Crimson Edition download page.


    More...

  7. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5887

    Anandtech: AMD Pre-Announces Bristol Ridge in Notebooks: The 7th Generation APU

    For anyone tracking AMD’s family of Accelerated Processing Units (APUs), the last generation Carrizo was launched back in the middle of 2015. This was based on the fourth iteration of the Bulldozer module design (the cores are codenamed Excavator), focused entirely on notebooks at a 15-35W power window. Today marks the pre-announcement of the 2016 line, Bristol Ridge, for notebooks.
    The main difference between Bristol Ridge and Carrizo is the implementation of a DDR4 memory controller, along with minor microarchitecture and manufacturing tweaks. We’ve already seen Carrizo/Excavator with DDR4 in the embedded space, and AMD is claiming that this latest generation offers up to a 50% CPU improvement over Kaveri, launched in 2014, with Bristol Ridge some 10% ahead of Carrizo due to the new memory support.
    Despite AMD quoting a 50% gain in Cinebench compared to Kaveri, AMD’s strengths in the notebook line are partly due to the integrated graphics, which historically get a boost from faster memory. How much of a boost depends on the underlying design by the OEM; as we detailed in our Carrizo OEM overview, base single-channel memory designs, rather than dual-channel, are the norm at retail.
    Today is a pre-announcement, which means that details are very thin. The reason this is not a full launch lies in one of AMD’s OEM partners, HP, announcing a new notebook at GTC this week based on AMD’s Bristol Ridge designs. HP is AMD’s biggest partner in notebooks, and is launching the HP Envy x360 15-inch variant using a Bristol Ridge part under the AMD FX naming scheme.
    We’ll go deeper into the Envy x360 announcements in a separate news post. But this pre-announcement means that AMD are happy to talk about high-level details such as 3DMark performance compared to Carrizo and Intel, FreeSync support, Dual Graphics, DirectX 12 and so on, but we will have to wait until Computex 2016 for the full breakdown of the APU advancements, SKU names, clock speeds and where these APUs will be implemented.
    As mentioned in previous news posts, AMD has confirmed that on the desktop, Bristol Ridge and the upcoming Summit Ridge processors featuring a brand new microarchitecture will share a platform. We could extrapolate (as others have done) to suggest that this notebook platform will also be the one supporting Summit Ridge in notebooks when it is released; however, AMD has not officially confirmed this as part of the pre-announcement. We will have to wait for Computex for more details.
    Further Reading:

    AMD Launches Excavator APUs for Embedded with DDR4 Support
    Who Controls the User Experience? AMD’s Carrizo Thoroughly Tested
    AMD Launches Carrizo: The Laptop Leap of Efficiency and Architecture Updates
    Gallery: AMD Pre-Announces Bristol Ridge in Notebooks: The 7th Generation APU




    More...

  8. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5888

    Anandtech: HTC Vive Launch Day News & First Impressions

    After what seems like entirely too long a wait, the consumer VR headset business is finally in full swing. Last week we saw the launch of Oculus’s Rift, and now this week we have the second of the major VR headsets launching, the HTC Vive. A joint development project between the reclusive game developer Valve and mobile manufacturer HTC, the Vive offers a second take on what a first-generation consumer VR headset should look like and be capable of doing.
    Today is the official launch of the headset, as HTC begins delivering units to pre-order customers. Unfortunately the headset is in short supply, and if you didn’t pre-order one back at the end of February you’re going to be waiting a while for a chance to get one, as HTC is not taking further pre-orders at this time. Nor is it clear at this point which will be faster: waiting for pre-orders to reopen, or trying to snag a retail unit in May.
    In any case, alongside the launch of the Vive, there are also driver updates to talk about. AMD pushed out a new Radeon driver release last night – 16.4.1 – and while their previous release already supported the Vive, this newest release offers updated support for the headset. As of press time there hasn’t been an NVIDIA driver update, but I wouldn’t be surprised if we see one soon.
    HTC Vive Retail: First Impressions

    While a full review of the Vive is still in the works, I wanted to offer some impressions from using the final, retail headset at GDC 2016. There the Vive was on demonstration at several locations, including Epic Games’ booth and Valve’s own ballroom center. Although my time with the headset was somewhat limited – VR headsets were by far the most popular attraction at GDC, and the booths wanted to get as many people through as possible – I did have a chance to check out the headset with both Valve’s and Epic’s demos.
    Starting then with the Valve demo, the company was at the show to showcase the headset and their accompanying The Lab game, a collection of mini-games more or less set in the Portal universe. The version of The Lab the company was showing off was incomplete – only four of the mini-games were available – so I can’t comment on the complete package. But if you’re familiar with mini-game collections such as early Wii titles, then you should have an idea of the kind of smorgasbord experience Valve is going for here.
    HTC Vive Specifications
    Display | 2x OLED
    Resolution | 2160x1200 combined (1080x1200 per eye)
    Refresh Rate | 90 Hz
    FOV | ~110°?
    Sensors | Gyroscope, Accelerometer, Laser Position Sensor
    Position Tracking | 2x "Lighthouse" Base Stations
    Audio | BYO, Earbuds Included With Kit
    Controls | 2x Vive Motion Controller
    Launch Date | 04/05/16
    Launch Price | $799
    But before we get too far into The Lab, let’s talk about the Vive headset itself. The units Valve was using at GDC were not any different from the first Vive Pre units announced at CES 2016 in January, and while I don’t have confirmation of this, at this point I would be surprised if the retail units are physically any different from the Pre units, as I wasn’t able to spot any differences. In either case, my initial impression is that while Valve isn’t doing the hardware manufacturing themselves – HTC getting that honor – it does at times show that level of polish that Valve products are known for.
    With the Vive, Valve is looking to push room-scale VR right off the bat, which sets it apart from the Oculus Rift and its initial, Xbox One controller-based experience. To that end the Vive ships with the headset itself, two base stations for tracking the user, and two handheld motion controllers. What you won’t find with the Vive is any kind of gamepad, and while Valve isn’t preventing developers from doing more traditional games – the space sim Elite: Dangerous already supports the Vive – they are clearly trying to get developers into the mindset of working with motion.
    In terms of specifications and capabilities, the Vive headset sports a pair of 1080x1200 OLED panels – one per eye – for a combined 2160x1200 experience that runs at 90Hz. As a first-generation headset the Vive is otherwise quite similar to the Oculus Rift in a lot of fundamental ways, and I wouldn’t be surprised if both headsets are sourcing OLED panels from the same manufacturer. This parallel development between the two headsets means that there’s a lot in common between the two, but there are also some notable differences. And with apologies to Valve and HTC here, I’m going to make a lot of comparisons to the Rift, as comparing and contrasting the two helps to understand the strengths and weaknesses of both devices.
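    To put those panel numbers in perspective, a quick bit of arithmetic (plain Python, figures from the spec table above) shows the raw pixel rate a Vive-ready GPU has to sustain, before accounting for any supersampling on top of it.

```python
# Raw pixel rate the HTC Vive asks of a GPU, from the spec table above.
width, height = 2160, 1200    # combined resolution across both eyes
refresh_hz = 90

pixels_per_frame = width * height                   # 2,592,000
pixels_per_second = pixels_per_frame * refresh_hz   # 233,280,000

print(f"{pixels_per_frame:,} pixels per frame")
print(f"{pixels_per_second / 1e6:.0f} million pixels per second")
# For comparison, ordinary 1080p at 60 Hz is roughly 124 million pixels per second.
```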
    The display itself is of course paramount, and a lot of my experiences with the Rift are applicable to the Vive as well. This is a first-generation headset and clearly so; the screen-door effect is not strong, but depending on the kind of content being used, it is present at times. Valve’s own demos tend to be more abstract, which hides this fairly well. But games with more details, such as the Star Wars Trials on Tatooine tech demo, fare a bit worse.
    There is going to be a lot written (and even more debated) about which headset is better, and while I won’t make a definitive claim until I can fully review both headsets, I will say that the optics alone are very different. The Vive and Rift are clearly doing different things with their lenses, and in my experience so far I’ve found that the Vive is more prone to chromatic aberrations than the Rift. Reading text, for example, is harder on the Vive, and it seems like the sweet spot is smaller, to the point that you need to get any text straight in front of you for best results.
    However I’m unsure whether this is a headset configuration issue, or just something inherent to the Vive. There’s also the matter of whether having a blurrier experience at the periphery really matters; in my limited experience with the Vive, it seemed to work fine. Unfortunately I don’t have a good feel for the field of view yet, as I really need the two headsets side-by-side to make that determination.
    At this point I feel like I need more time on the Vive to get a great feel for how well it’s handling head-tracking, but I did not notice any immediate problems using the headset. In every Vive demo I’ve used, head tracking has registered to my conscious mind as instant, and I’ve had little trouble adapting (sometimes to the chagrin of the booth operator, as it’s similarly easy to get a bit too into the game and start drifting towards their expensive monitors with my long reach). However, re-acclimating to the real world after using the Vive was a bit more of a challenge than it was with the Rift; I needed longer to get my proper sense of depth perception back. But admittedly I’m not sure if this is a Vive thing or simply the fact that it was after 4pm when I finally wrapped up with Valve’s demo.
    Meanwhile, in terms of fit the Vive is much like the Rift, which is to say that it’s a sturdy fit that can stretch to accommodate even my large head. Weight also seems to be good, as the headset doesn’t pull too hard or get tiring to wear even with its front-heavy nature. In a notable difference from the Rift, however, the Vive does not include any kind of built-in headphones. This means that the user will need to bring their own audio gear, which opens up a lot of options for sound – everything from earbuds (which are included with the retail kit) to high-end headphones can be used here – but it does mean that putting on the Vive is a bit more involved, since you need to put on the headset and then the headphones (and then the motion controllers).
    Speaking of the motion controllers, this is without a doubt the biggest differentiator from the Rift at this time, as it’s the motion controls that unlock the room-scale experience Valve is pushing. If I can make another Wii analogy here, in-hand the Vive motion controllers have a similar feel, but of course with the Vive’s far better tracking and sensors. The motion controllers feel natural to use, and while it takes a bit of time to get used to using them without being able to see the buttons, it’s a quickly acquired skill. Having used both this and the forthcoming Oculus Touch controller, I will say that they have different feels to them and it will be noticeable; it’s a Wii-like wand versus what I can only describe as a more brass-knuckles kind of feeling to the Oculus.
    Getting back to the software then, throughout my experience with the Vive, I keep gravitating back to Wii comparisons, as coupled with the controller that is the most intuitive way to describe the experience. It’s highly accurate position tracked Wii remotes with a VR headset. There’s going to be a lot of experimentation here, both inside and outside of Valve, as developers get a feel for how motion controls need to work in a VR environment.
    The Lab of course is Valve’s internal effort to work this out with a tried and true mini-game collection. The GDC demo was composed of four experiences (I hesitate to call all of them games): what’s best described as a sight-seeing tour around Washington’s Vesper Peak, a slingshot game based around lobbing Portal 2 personality cores at distant stacked crates and exploding boxes (think first-person Angry Birds), an archery game defending yourself from the 2D cardboard cutout-like humans in various Aperture Science media, and finally a bullet hell shoot ‘em up that has no discernable Portal tie-in, in which your right hand is the ship and the action takes place in 3D. All of these demos are clearly meant to demonstrate a different concept of motion controls, and while I only had a few minutes with each, the shoot ‘em up was probably the most fun (and most likely to get someone a punch in the face). There are more mini-games in the full version of The Lab, and while I can’t say for sure whether any one game will be the stand-out, of what I’ve seen so far this looks like a fun and competent collection of games, and a good starting point for future games for Valve and other developers.
    Speaking of other developers, Lucasfilm’s ILMxLab also had a presence at GDC, teaming up with Valve and others to show off Star Wars: Trials on Tatooine, which is part of their own experiments with interactive storytelling. The demo is somewhere between a game and a real-time rendered VR movie, with a fairly lengthy cut-scene leading into a few minutes of VR motion-inspired gameplay, the highlight of which is deflecting blaster bolts at Stormtroopers. ILM was very clear about this being an experiment, and it does show at times, but it’s an example of one possible way we could see VR gaming and storytelling go in the future.

    Finally, Epic Games also had Vives on hand at their booth to show off Trials on Tatooine along with the latest build of their Bullet Train VR demo. A fully interactive experience, Bullet Train is the most combat-focused of all of the VR demos I’ve tried so far, having the user teleport around and use various guns to take out an attacking force inside of a train station. Between physical attacks, shooting, and picking bullets out of the air and throwing them back at the attackers, Bullet Train managed to keep things varied.
    A common theme in all of this is that developers are still in an experimental stage for game design, and while today’s launch of the Vive hardware marks the true beginning of consumer VR hardware, the software side is still in its infancy. Valve’s vision of room-scale motion controlled VR is going to take some time to evolve, but I’m curious to see where we end up in the years to come. In the more immediate future, it will be early titles like the pack-ins Fantastic Contraption and Job Simulator that set the initial pace for Vive VR gaming.
    Zotac ZBOX MAGNUS EN980: A Small PC For Room-Scale VR

    Finally, while checking out the Vive I also had a chance to stop by and quickly chat with Zotac, who were nearby showing off their upcoming ZBOX MAGNUS EN980 mini-PC. Beyond Valve’s demo area being a high-traffic location that’s great for drawing attention, Zotac’s planning also highlights the fact that room-scale VR as Valve envisions it does not necessarily mesh well with traditional tower-style PCs. Going back to the Wii as an example, in most homes I suspect the only place large enough (and safe enough) for the upwards of 15ft x 15ft space that Valve is planning for is the living room, and while this doesn’t preclude using a tower, it’s perhaps not the best environment for one. To that end Zotac is showing off the EN980, which they are specifically pitching for use as a small form factor VR machine.
    ZOTAC ZBOX MAGNUS EN980 Specifications
    Processor | Intel Core i5-6400
    Memory | DDR3L SO-DIMMs
    Graphics | NVIDIA GeForce GTX 980 For Notebooks
    Storage | M.2 SSD, one or two 2.5" SSDs/HDDs
    Networking | 2x Gigabit Ethernet, IEEE 802.11ac Wi-Fi and Bluetooth
    Audio | Capable of 5.1/7.1 digital output with HD audio bitstreaming (HDMI)
    I/O | USB 3.0 (Type-A), USB 3.1 (Type-C), SD card reader, HDMI
    Operating System | Compatible with Microsoft Windows 7, 8, 8.1 and 10
    Inside the EN980 is an Intel Core i5-6400, which is paired with NVIDIA’s GeForce GTX 980 for Notebooks, their most powerful mobile video card. This is especially notable as it is the only mobile GeForce card fast enough to meet Valve’s recommended system requirements for the Vive. With the EN980, Zotac is essentially aiming to build a console-sized PC that would be at home next to the traditional consoles, but suitably powerful for VR.

    Zotac's EN980 Next to a Steam Link
    The device as they’ve been showing it off is still a prototype (though a near-final one), with the final unit expected to launch at Computex. While the price will undoubtedly be high given the components they’re using, I do feel that Zotac has the right idea for how small form factor PCs can stay relevant to the room-scale experience. However, with the current prototype I do fear that Zotac has overlooked I/O port placement – specifically, the HDMI port is on the back instead of the front for easily plugging in a Vive – though there is talk of including a break-out box of some kind to make these ports accessible from the front. Overall I wouldn’t be surprised to see the EN980 and similar designs become a small but noticeable niche for VR; if Vive sales are any indication, there is a need for VR-capable machines in the living room.

    More...

  9. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5889

    Anandtech: HP Unveils Spectre: The World’s Thinnest Laptop

    HP has announced its new Spectre laptop - the world’s thinnest 13.3” notebook. Despite its miniature size, the Spectre uses Intel’s Core i5/i7 microprocessor, a PCIe-based SSD, a full-size keyboard as well as an advanced audio system developed by Bang & Olufsen. To build its new notebook, HP had to use a number of innovative technologies, although to make the system so small and thin it had to sacrifice upgradeability and serviceability. On top of it all, the HP Spectre comes across as very price competitive.
    The HP Spectre is just 10.4 mm thick and weighs 1.10 kilograms (2.45 lbs). To build it, HP had to use CNC-machined anodized aluminum and combine it with carbon fiber, a custom display panel with Gorilla Glass 4 as well as special recessed piston hinges. To maximize battery life (which is officially listed at 9 hours 30 minutes), HP had to use two types of batteries of different shapes inside its Spectre laptop, giving four cells in total. As part of a premium line, it comes in a black and gold finish as well as with a different variant of HP’s logo, emphasizing the positioning of the system.
    The HP Spectre laptop is based on the Core i5-6200U (two cores with Hyper-Threading, 3 MB L3 cache, Intel HD Graphics 520) or the Core i7-6500U (two cores with Hyper-Threading, 4 MB L3 cache, Intel HD Graphics 520), but the manufacturer does not disclose exact clock-rates of the CPUs. HP notes that the microprocessors may not necessarily work at their default frequencies all the time, but indicates that the chips are still considerably faster than the Core M processors used inside the Apple MacBook. These could either be the full fat 15W parts, or the models could be in cTDP down mode.
    Comparison of Ultra-Thin Notebooks
     | HP Spectre | Apple MacBook (2015) | ASUS UX305UA
    Screen Resolution | 1920×1080 | 2304×1440 | 1920×1080 or 3200×1800
    CPU | Intel Core i5-6200U or Core i7-6500U | Intel Core M 1.1 GHz, 1.2 GHz or 1.3 GHz | Intel Core i3-6100U, Core i5-6200U or Core i7-6500U
    Graphics | Intel HD Graphics 520 (24 execution units) | Intel HD Graphics 5300 (24 EUs) | Intel HD Graphics 520 (24 execution units)
    RAM | 8 GB (LPDDR3?) | 8 GB LPDDR3 | 4 GB or 8 GB LPDDR3
    Storage | 256 GB or 512 GB SSD | 256 GB or 512 GB SSD | 128 GB, 256 GB or 512 GB SSD
    Wi-Fi | 802.11ac Wi-Fi | 802.11ac Wi-Fi | Wi-Fi
    USB 3.1 | 3 × Type-C | - | -
    USB 3.0 | - | 1 × Type-C | 2 × Type-A
    USB 2.0 | - | - | 1 × Type-A
    Thunderbolt | 2 × Thunderbolt 3 | - | -
    HDMI | - | - | micro-HDMI
    Other I/O | Microphone, stereo speakers, audio jack
    Thickness | 10.4 mm | up to 13.1 mm | 16 mm
    Weight | 1.10 kilograms | 0.92 kilograms | 1.30 kilograms
    Price | 256 GB: $1170 (Core i5), $1250 (Core i7) | 256 GB: $1300 (1.1 GHz), $1600 (1.3 GHz) | $750 - $1200
    To cool the CPUs, HP uses its so-called 'hyperbaric cooling' technology, which features two ultra-thin fans, a heat-pipe as well as a special copper radiator. The fans take in cool air from the outside and create significant air pressure inside the chassis to push hot air out. The company does not disclose how loud the cooling system is, but implies that noise levels produced by the Spectre should be comfortable.
    The Spectre is equipped with 8 GB of memory (we believe LPDDR3) that is soldered to the motherboard, which means that it cannot be upgraded. The notebook also comes with a 256 GB or 512 GB SSD, but HP does not release performance figures or state which model it is. The company also says that since the design of its laptop is “sealed”, it is impossible to upgrade the storage drive at home, suggesting a soldered-down version of an M.2 drive. Keeping in mind that it is impossible to add storage to the majority of tablets and 2-in-1 hybrid PCs, it is not surprising that HP chose to limit the upgrade capabilities of its Spectre notebook. When creating the Spectre, HP focused on making an extremely thin design rather than on offering room for later upgrades.
    Due to the thin design and relatively limited battery capacity of the HP Spectre, the manufacturer did not have much choice when it came to display panels. HP uses a 13.3” full-HD (1920×1080) IPS panel with 300 nits brightness, which covers 72% of the NTSC color gamut. According to HP, the panel provides the right balance between resolution, power consumption, brightness and price. Nevertheless, the most important feature of the panel is its thinness: the whole display assembly is just 2 mm thick.
    The HP Spectre also comes with three USB Type-C ports, two of which support Thunderbolt 3. These are powered by Intel's Alpine Ridge controller, though HP has not disclosed how many controllers are used (either one controller for both ports, or one controller for each). The TB3-enabled ports can be used to connect external displays as well as Thunderbolt 3 peripherals. It is noteworthy that since HP does not talk about connecting things like external graphics adapters to its Spectre notebooks, this capability may not be enabled right now.
    The notebook is also equipped with a keyboard that has 1.3 mm of travel as well as a trackpad with a full-travel etched glass surface. HP compares its Spectre to Apple’s MacBook and claims that the MacBook’s keyboard and touchpad are less comfortable to use than those on the HP machine, something that needs to be verified by independent reviews.
    Communication capabilities of the HP Spectre include a Wi-Fi and Bluetooth module that supports 2.4 GHz and 5 GHz networks. The module supports only one “slot” antenna; therefore, its performance may be lower than that of other modern notebooks, another thing that HP may have had to sacrifice for portability. HP also does not specify whether the Wi-Fi is 802.11ac or 802.11n only.
    This month HP will begin to sell its Spectre laptop at hp.com as well as at Best Buy in the U.S. The Spectre based on the Intel Core i5-6200U and equipped with 8 GB of RAM and 256 GB SSD will cost $1169.99, whereas the model featuring the Intel Core i7-6500U, 8 GB of memory and a 256 GB SSD will be priced at $1249.99. There is no word on the cost of the 512GB models as of yet, although we expect another $100-$150 on top of that. In May, the manufacturer plans to start selling its thinnest laptop in other countries, but it does not reveal the list of countries or recommended prices. In the US, HP will also offer limited edition systems co-developed with famous designers combining a golden finish and Swarovski crystals. In addition, the company will also sell accessories (a Bluetooth mouse, a carry bag and a leather sleeve) that match the design of the Spectre notebook.


    More...

  10. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5890

    Anandtech: The NVIDIA GTC 2016 Keynote Live Blog

    We're here in sunny San Jose for the 2016 edition of NVIDIA's annual GPU Technology Conference (GTC). With CEO Jen-Hsun Huang presenting, we're expecting a Pascal-heavy presentation about the next year of major initiatives from NVIDIA on the HPC and professional side, and possibly some consumer news as well.


    More...
