
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #5821

    Anandtech: VRScore: Crytek and Basemark Announce New Benchmark Release

    Virtual Reality has been running a slow and steady race for a while now. Thankfully, with consumer VR headsets about to go on sale, more of us will be able to see the fruits of years of hard work. One very important aspect of any new consumer product is measuring its capability, and consequently its value. In the case of VR there are many links in the chain that can make or break the experience, so Basemark and Crytek have teamed up to bring us VRScore, a new VR benchmark, which is being announced and made available to corporate customers today and to consumers later this year.
    Basemark and Crytek are collaborating with the goal of creating a benchmark that shows how ready consumers are for VR, and the potential that VR could achieve. On one side we have Basemark, who brings their benchmarking pedigree, and on the other Crytek, who have been a reference point in the world of game engines for over a decade. Last year they began working together on the VRScore benchmark, with the goal of producing a real-world VR benchmark based on a game engine currently in use for production of AAA titles. After a few months of development, Basemark and Crytek are ready to announce today that their VRScore benchmark is available immediately to corporate customers. Consumers will be able to get their hands on the benchmark this June, in both a limited free version and a fully functional Pro version.
    With just over 3.5 years between us and the Kickstarter campaign for the Oculus Rift, we are finally on the cusp of enthusiast-grade consumer VR coming to market. As has long been established, VR requires a high-resolution display running at a high refresh rate with low input latency. Latency can enter the VR chain at every step: from the moment the user gives input, through the application processing that input, to the application outputting the result to the screen. In the mix is also the audio configuration, which by default needs to be positional and reactive. In the case of VR all of this is rolled into one package that isolates you from the outside world, so naturally any weak link in the chain quickly leads to a poor experience. For that reason, preparing a system to run a VR experience requires particular care. For both the HTC Vive and the Oculus Rift, the minimum requirements come out to what in today’s market is likely to be an $800 custom build or more. More than a third of that will have to go into the GPU, as both vendors recommend an NVIDIA GTX 970, AMD R9 290, or better.
    The VRScore benchmark supports all major VR headsets, including the HTC Vive, the Oculus Rift and OSVR. Because it is built on CryEngine 3, it supports both DirectX 11 and DirectX 12. VRScore provides tests for measuring a plethora of metrics that are important to the VR experience: latency, the impact of the HMD and VR audio on the system's performance, and an analysis of the video experience. Once these tests are complete, performance numbers for the HMD, GPU, and CPU are given. Alongside these scores, an online database will be provided to see how a given machine compares with friends and others interested in VR.
    After years of watching the industry work away at the challenges of a growing platform, we are finally approaching products we can buy off the shelf. Basemark and Crytek have been working to provide us with a benchmark to measure the readiness of our computers for the rigors of VR. While VR will be a very demanding endeavor for quite a long time, VRScore is an available option for measuring the readiness of an existing or future system for VR gaming. We are also currently working with a few companies to provide other angles on VR metrics. VR has the stage set for a lot of growth in the future, but nothing can be controlled without first being measured. VRScore and utilities like it will be necessary to find the value in VR systems this year and in years to come.
    Related Reading
    AMD Announces Radeon Pro Duo: Dual GPU Fiji Video Card For VR Content Creation
    EVGA Begins Selling "VR Edition" GeForce GTX Video Cards for VR Gaming Rigs
    HTC Vive Will Be Launching In April Priced At $799
    Stepping Into the Display: Experiencing HTC Vive
    Testing the HTC (Re) Vive with Steam VR
    Oculus VR Reveals Retail Price of Its Virtual Reality Headset: $599
    Oculus Rift Controllers, VR Games, And Software Features Announced
    Oculus VR Posts Recommended System Specs For Rift - Outlines Platform Goals
    Oculus Demos Crescent Bay and VR Audio


    More...

  2. #5822

    Anandtech: Lattice and MediaTek to Collaborate On Reference Smartphones For SuperMHL

    Lattice Semiconductor and MediaTek have joined forces to create reference smartphones with 4K video outputs using USB Type-C connectors and cables. The outputs will use the superMHL technology and will thus be compatible with various other superMHL devices such as TVs. The companies have already built a reference phone that supports a superMHL output over USB-C using previously announced chips from Lattice and hope that makers of smartphones will embrace the solution.
    The USB Type-C technology and connectors are rapidly gaining traction across many industries. A number of modern notebooks, mobile devices and even desktops come with USB Type-C ports. A good thing about USB-C is that it can be used to transfer alt mode data using a variety of protocols, including DisplayPort, Thunderbolt 3 and superMHL. Lattice (which acquired Silicon Image, a major developer of the MHL technology, about a year ago) clearly wants to capitalize on its expertise in the field of MHL and USB technologies by offering a reference implementation of single-lane superMHL over USB-C cables and connectors.
    The simplified single-lane implementation of superMHL used here for phones supports transmission of 4K (3840x2160) video at 60 fps with deep color over USB-C or even traditional MHL cables.
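A quick back-of-the-envelope calculation shows why a single lane leans on "visually lossless" compression for a stream like this. The video parameters below are nominal figures for 4Kp60 at 24-bit color, not numbers quoted by Lattice, and blanking/link-encoding overhead is ignored:

```python
# Raw payload of an uncompressed 4Kp60, 4:4:4, 24-bit video stream.
# Blanking intervals and link encoding overhead are ignored here.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 24

raw_gbps = width * height * fps * bits_per_pixel / 1e9
print(f"Uncompressed 4Kp60 stream: {raw_gbps:.2f} Gbit/s")  # ≈ 11.94 Gbit/s
```

Roughly 12 Gbit/s of raw pixel data is more than a single Type-C lane carries in most alt modes, which is why the single-lane approach trades a lossy codec for the second lane DisplayPort requires.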
    USB Type-C Alt Mode Display Standard Comparison
                            superMHL                        DisplayPort 1.3
    Resolution              4Kp60, 4:4:4 @ 24bit            4Kp60, 4:4:4 @ 24bit
    Type-C Lanes Required   1                               2
    Image Compression       "Visually Lossless" (Lossy)     No Compression
    TV Interface            superMHL-over-HDMI (Passive),   HDMI (Active Conversion)
                            HDMI (Active Conversion)
    Power Charging          USB-PD or MHL (legacy)          USB-PD

    The superMHL over USB-C reference implementation by Lattice and MediaTek involves the Helio X20 system-on-chip (two ARM Cortex-A72 cores, four Cortex-A53 at 2 GHz, four Cortex-A53 at 1.4 GHz, an ARM Mali T880 MP4 graphics core and a dual-channel LPDDR3 memory controller), the Sil8348 MHL transmitter, as well as the Sil7033 port controller (which sets up MHL alt mode on USB-C and supports power data objects, which are needed to charge the phone). The superMHL implementation by Lattice and MediaTek supports the best of both technologies’ worlds: 10 Gbps USB 3.1 transfer rates, 4Kp60 video, power delivery (for up to 100 W of power), and the ability to connect to TVs with MHL using appropriate adapters. What is notable is that thanks to the Sil7033 chip, simultaneous MHL and USB 3.1 connectivity is also supported. Lattice also offers the Sil7013 and Sil9396 port controllers for docks and accessories, to establish MHL alt mode and convert MHL to HDMI transmission respectively.

    The creation of these reference devices is the latest salvo in the ongoing battle Lattice is waging over the future of video out for mobile devices. MHL was reasonably common in previous-generation devices; however, with the switch to USB Type-C ports and the creation of alt modes, there is an opportunity to start anew. Practically speaking, this is a war between superMHL and DisplayPort, which are the two major video out alt modes.
    One thing that will be crucial for enabling 4K output over a USB-C interconnect is quality USB Type-C cables. It is no secret that there are cheap USB 3.1 Type-C cables that do not work as advertised and simply lack the conductors that alt modes repurpose. Such cables will not be able to support 4K output using USB-C. Cheap USB Type-C cables that do not comply with the standards are an industry-wide problem. While today many people may simply not notice issues with them, because the cables are only used for data transmission with USB 2.0 hosts, in the coming years the problem will get much worse.

    Finally, Lattice and MediaTek did not reveal whether their UHD-output-over-USB-C reference platform has yet been adopted by any smartphone makers.
    Gallery: Lattice and MediaTek to Enable 4K Outputs on Smartphones Using USB-C




    More...

  3. #5823

    Anandtech: Qualcomm’s New SDK Enables Development of VR Apps on Snapdragon 820

    Qualcomm on Monday introduced its first virtual reality software development kit, designed for its Snapdragon 820 mobile SoC. The new tools will enable software makers to create programs that take advantage of Snapdragon 820’s graphics processing capabilities (i.e. Adreno) as well as built-in sensors. Qualcomm confirmed that in addition to smartphones and other mobile devices, the Snapdragon 820 will also be used inside VR headsets.
    The Samsung Gear VR platform, as well as Google’s Cardboard, have demonstrated that smartphones based on contemporary high-end mobile SoCs can be used to power virtual reality headsets. While the graphics performance of mobile SoCs lags behind modern desktop graphics from AMD or NVIDIA, they integrate numerous sensors and technologies which can be crucial for virtual reality equipment. In fact, a positive virtual reality experience requires not only high-quality visuals and surround sound but also the complete immersion of the user and a sense of physical presence. As a result, precise sensors to track the user’s movements and minimal latency are very important. But to fully utilize the capabilities of modern mobile SoCs, software developers need the right set of tools tailored for VR software. Also, given the secrecy around the internal Adreno GPU and its microarchitecture, any set of tools that can assist with graphics/DSP manipulation is a good thing to have.
    Qualcomm’s Snapdragon VR SDK, which will be available in the second quarter, supports a number of technologies that simplify development of virtual reality applications, such as games, 360° VR videos and a variety of interactive education and entertainment apps.
    The Snapdragon VR SDK supports DSP sensor fusion, which allows developers to access high-frequency inertial data from gyroscopes and accelerometers via the Snapdragon Sensor Core. The software development kit also allows developers to use the Qualcomm Hexagon DSP for predictive head position processing.
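As an illustration of what "predictive head position processing" means, here is a minimal sketch (ours, not Qualcomm's API) that extrapolates a head orientation quaternion a few milliseconds ahead using the latest gyroscope reading:

```python
# Sketch: extrapolate orientation quaternion q forward by the latest
# angular velocity. This is a generic prediction scheme, not the actual
# Snapdragon VR SDK interface.
import math

def predict_orientation(q, omega, dt):
    """Rotate quaternion q = (w, x, y, z) forward by angular velocity
    omega = (wx, wy, wz) in rad/s over dt seconds."""
    wx, wy, wz = omega
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle < 1e-9:
        return q  # effectively no rotation in this interval
    # Axis of the incremental rotation and its half-angle quaternion.
    ax, ay, az = wx*dt/angle, wy*dt/angle, wz*dt/angle
    half = angle / 2.0
    dq = (math.cos(half), ax*math.sin(half), ay*math.sin(half), az*math.sin(half))
    # Hamilton product q * dq gives the predicted orientation.
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

# Predict 18 ms ahead while the head yaws at 1 rad/s.
q = predict_orientation((1.0, 0.0, 0.0, 0.0), (0.0, 1.0, 0.0), 0.018)
```

Predicting roughly one display interval ahead lets the renderer draw the frame for where the head will be when the photons arrive, the same idea that asynchronous time warp then refines just before scan-out.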
    Use of the Snapdragon VR SDK reduces latency by up to 50% by using asynchronous time warp with single-buffer rendering for rapid transformation of rendered images in 3D space. Qualcomm says that its Snapdragon 820 SoC achieves 18 ms motion-to-photon latency thanks to various enhancements.
    The Snapdragon VR SDK also brings support for stereoscopic rendering with lens correction, color correction and barrel distortion, something that should improve the visual quality of graphics and videos. According to Qualcomm, the Snapdragon 820 can render stereoscopic images in 3200x1800 resolution at 90 fps. In addition, the software development kit can help to generate menus that are readable in VR worlds thanks to UI layering.
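The barrel distortion step can be pictured with a small sketch. The radial model below is the standard textbook form, and the coefficients are illustrative placeholders, not values from the Snapdragon VR SDK:

```python
# Radial pre-warp applied before a VR lens: r' = r * (1 + k1*r^2 + k2*r^4).
# The lens optically bends the warped image back, so the eye sees an
# undistorted view. k1 and k2 here are illustrative placeholders.
def barrel_warp(x, y, k1=0.22, k2=0.24):
    """Warp a point given in normalized coordinates centered on the lens."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

print(barrel_warp(0.0, 0.0))  # the center is untouched: (0.0, 0.0)
print(barrel_warp(0.5, 0.0))  # points toward the edge are pushed outward
```

Each eye's image gets this pre-warp (plus a per-channel variant for color correction, since the lens refracts wavelengths differently), which is why the SDK bundles lens correction, color correction and barrel distortion together.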

    Finally, the Snapdragon VR SDK gives developers access to CPU, GPU, and DSP power and performance management in a bid to help them guarantee high and stable frame rates (90 fps) in low-power devices. Precise power management is also required to build sleek and lightweight VR headsets.
    While the launch of a special Snapdragon VR SDK is a significant step for Qualcomm in the field of virtual reality, what is really important is Qualcomm’s commitment to VR in general. The company claims that it developed the Snapdragon 820 with virtual reality in mind and it will continue to implement VR-specific technologies into its upcoming Adreno graphics cores, CPU cores as well as Hexagon DSPs. Keeping in mind that VR headsets will only get more complex in the coming years, all the technologies that Qualcomm manages to incorporate into its SoCs will be instrumental in improving the quality of VR content.

    For ecosystem enablement, Qualcomm will initially bring developers this VR SDK, and then also app development tools, device optimization tools, development platforms, and so on. In particular, Qualcomm claims that VR headsets based on the Snapdragon 820 are incoming, which will allow end-users to experience VR apps and content, whereas developers will be able to test their programs on commercial hardware.
    Gallery: Qualcomm’s New SDK Enables Development of VR Apps on Snapdragon 820-Based VR Gear




    More...

  4. #5824

    Anandtech: Sony to Start Selling PlayStation VR in October for $399

    Sony has published the price of its PlayStation VR virtual reality headset and confirmed its final specifications and launch timeframe on Tuesday. The company intends to start selling the unit in October for $399, which is slightly later than expected. While the product will be considerably more affordable than competing headsets from Oculus VR and HTC - though the sticker price does not include the required camera - its technical specs are somewhat behind its rivals.
    Specifications of VR Headsets
                        Sony PlayStation VR         HTC Vive                    Oculus Rift
    Display             5.7" OLED                   2x OLED                     2x OLED
    Resolution          1920x1080                   2160x1200                   2160x1200
                        (960x1080 per eye)          (1080x1200 per eye)         (1080x1200 per eye)
    Refresh Rate        90 Hz, 120 Hz               90 Hz                       90 Hz
    Sensors             three-axis gyroscope,       over 70 sensors,            6-degree-of-freedom tracking:
                        three-axis accelerometer    MEMS gyroscope,             3-axis rotational,
                                                    accelerometer               3-axis positional
    Position Tracking   PlayStation Camera          Laser Vive Base Stations    Constellation system based
                                                                                on infrared sensors
    The Sony PlayStation VR head-mounted display (HMD) (CUH-ZVR1-series) features a 5.7” OLED display with a 1920x1080 (960x1080 per eye) resolution, a 90 Hz – 120 Hz refresh rate and an approximately 100° field of view. The PS VR HMD is equipped with a six-axis motion sensing system (three-axis gyroscope, three-axis accelerometer) as well as stereo headphones. Right now, Sony is not revealing many details about its VR headset, so we do not know many particulars of the HMD, such as display or motion-to-photon latencies. Nonetheless, based on what we do know about Sony’s VR headset, we can say that it uses a lot of custom-made components, which are optimized for virtual reality. Every PlayStation VR will come with a special processor unit, which plugs into the PlayStation 4 game console. Sony’s VR headset will connect to the processor unit using an HDMI cable, a PS4 AUX cable, as well as a stereo mini-jack.
    Sony does not disclose what is inside the processor unit, but it claims that it handles 3D audio processing as well as enabling multi-display capabilities. In particular, the processor unit can enable cinematic mode, which lets users watch content or play games (including currently available PS4 titles) on a large virtual 225” screen. In addition, the processor unit can show what happens in virtual worlds on a TV screen (in mirroring mode) as well as show different content on the TV and VR screens (in separate mode). The processor unit comes in at a relatively light 365 grams, so while Sony isn't disclosing much about the internals, it's unlikely to contain much in the way of high-powered hardware; perhaps a semi-custom processor of some kind. The processor unit will use three HDMI ports, a USB port and an AUX cable to connect to a TV, a PS4 and a PS VR.

    Each PlayStation VR headset will be bundled with the PlayRoom VR set of games by Sony Computer Entertainment Worldwide Studios. The PlayRoom VR suite will include six online virtual reality games.
    Sony claims that 230 game developers and publishers are working on 160 different PS VR software titles, and that 50 of them will be available this year. Sony says that games such as Eagle Flight, EVE: Valkyrie, Headmaster, Rez Infinite, Wayward Sky, RIGS: Mechanized Combat League, Tumble VR and Until Dawn: Rush of Blood may hit the market as early as 2016.

    While the price of Sony’s PlayStation VR is not low, and its bundle does not include the PlayStation Camera (around $50) needed to track the headset’s position or the PlayStation Move hand-tracking controllers (approximately $50 per unit), which are expected to be required for many games, it looks like Sony’s PlayStation 4 virtual reality platform will cost gamers less than competing PC-based VR platforms (albeit at the cost of lower resolution). In order to try VR games on the PS4, gamers will need to invest $399 in the PlayStation VR, $110 in the camera and controllers and $349 in the PlayStation 4 (around $858 in total, assuming they don’t have a PS4 and the aforementioned hardware). By contrast, the Oculus Rift and HTC Vive headsets will cost $599 and $799, respectively, and will require a relatively powerful PC that typically starts at $949. It remains to be seen how the lower MSRP of Sony’s virtual reality platform will affect its popularity among gamers, and whether Sony can wring any further economies of scale out of production. Nonetheless, the lower price will be Sony’s trump card in its competition against Oculus VR and HTC.
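Tallying the round figures above gives a sense of the gap. Note the $949 PC price is the "typically starts at" estimate, not a hard requirement:

```python
# Minimum buy-in per platform, using the article's round figures.
entry_cost = {
    "PlayStation VR": 399 + 110 + 349,  # headset + camera/Move + PS4
    "Oculus Rift":    599 + 949,        # headset + VR-capable PC
    "HTC Vive":       799 + 949,        # headset + VR-capable PC
}
for platform, usd in sorted(entry_cost.items(), key=lambda kv: kv[1]):
    print(f"{platform}: ${usd}")
```

Even with the camera and controllers included, the PS4 route comes in several hundred dollars under either PC platform.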

    Sony will begin to roll out its PlayStation VR in October 2016 in Japan, North America, Europe and Asia, where it will cost ¥44,980, $399, €399 and £349 respectively. The company said that it decided to delay the release of the PS VR from the first half of the year to October in order to ensure that it can make enough hardware units. Another reason to delay the product could be the readiness of VR games: Sony needs to ensure that their quality is high and their performance is solid, which is why it needed to buy some additional time for developers.

    Gallery: Sony to Start Selling PlayStation VR in October for $399




    More...

  5. #5825

    Anandtech: NVIDIA Announces GameWorks SDK 3.1

    Innovation is hard work. Doing work that has already been done elsewhere can be satisfying, but also annoying - no one wants to reinvent the wheel every time. In the realm of 3D graphics, developers are not limited to creating their wares from scratch - toolsets such as NVIDIA GameWorks allow them to include advanced graphics rendering and physics simulation features in their products. The latest version, NVIDIA GameWorks 3.1, is being released this week.
    NVIDIA GameWorks SDK 3.1 introduces three new graphics technologies involving shadows and lighting. NVIDIA Volumetric Lighting simulates how light behaves as it scatters through the air, and was showcased in Fallout 4. Moving over to shadows, we see NVIDIA Hybrid Frustum Traced Shadows (HFTS), which renders shadows that start as hard shadows near the casting object and transition to soft shadows further away. Lastly among the new graphics features is NVIDIA Voxel Accelerated Ambient Occlusion (VXAO), which NVIDIA dubs the highest-quality ambient occlusion algorithm. What makes it better than previous techniques is its ability to calculate shadowing with all geometry in world space, versus older screen-space techniques that can only cast shadows from geometry visible to the camera.
    Adding to the roster of PhysX features is NVIDIA PhysX-GRB, a new implementation of NVIDIA’s PhysX rigid body dynamics SDK. This new implementation provides a hybrid CPU/GPU pipeline that NVIDIA claims can improve performance by a factor of up to 6X for moderate to heavy simulation loads, especially those that lean heavily on compute shader register resources. NVIDIA Flow is the other update to PhysX, introducing the ability to simulate and render combustible fluids such as fire and smoke; this time the simulation is not confined to a bounding box. This should lead to much more flexibility and usefulness in games and other software in the future.


    More...

  6. #5826

    Anandtech: Hands On With the Retail Oculus Rift: Countdown to Launch

    Today I'll be going over my hands-on session with the final, retail version of the Oculus Rift, Oculus's soon to launch VR headset. As part of their GDC festivities, Oculus held a lengthy press demo to give us a chance to try out the retail hardware with a number of games being prepared for the headset, to demonstrate not only the hardware but the games and experiences that it will be driving. A full review of the Rift will be coming later, but for today I wanted to discuss my impressions of the retail hardware and the various titles I had a chance to try.

    More...

  7. #5827

    Anandtech: MSI Introduces Two New GTX 950 2GB Cards with 75W TDP: the OCV2 and OCV3

    A standard GTX 950 reference design comes in at a 90W TDP, which means it requires a 6-pin PCIe power connector, as a standard PCIe slot is typically rated to provide only 75W. However, we have seen a couple of AIB partners introduce new 75W versions of the GTX 950 that come in under that limit, and as a result remove the need for the 6-pin PCIe connector. One of those is MSI, who has introduced two new video cards that can be powered solely by the PCI Express connection. The low-power GeForce GTX 950 graphics boards from MSI follow similar graphics adapters from ASUS, and indicate that these use NVIDIA’s GM206-251 GPU. At this point we might conclude that the '251' nomenclature refers to a specific silicon revision which might afford lower-power operation.
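The power math behind these cards is straightforward. The connector figure below is the commonly cited PCIe rating; only the 75 W slot limit is quoted above:

```python
# Why a 90 W reference GTX 950 needs a 6-pin plug while a 75 W variant doesn't.
SLOT_LIMIT_W = 75     # power a standard PCIe x16 slot is rated to provide
SIX_PIN_W = 75        # additional power a 6-pin auxiliary connector is rated for
REFERENCE_TDP_W = 90  # reference GTX 950 TDP

print(REFERENCE_TDP_W > SLOT_LIMIT_W)               # True: 6-pin required
print(REFERENCE_TDP_W <= SLOT_LIMIT_W + SIX_PIN_W)  # True: one 6-pin is ample
```

Trimming the TDP to 75 W puts the whole card inside the slot's own budget, which is exactly what makes the connector-less designs possible.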
    The two new cards from MSI are the GeForce GTX 950 2GD5T OCV3 and the MSI GeForce GTX 950 2GD5T OCV2 and are based on the GeForce GTX 950 (GM206) in its default configuration: 768 stream processors, 48 texture units, 32 ROPs as well as a 128-bit GDDR5 memory interface. The GPUs of both cards are clocked at 1076 MHz, but can increase their frequencies to 1253 MHz in boost mode. Both graphics adapters are equipped with 2 GB of GDDR5 memory clocked at 6.6 Gbps, a dual-slot cooling system with an aluminum heatsink, one DVI connector, one HDMI 2.0 port and one DisplayPort output. As mentioned above, neither of the boards require additional PCIe power connectors and their TDP does not exceed 75 W.
    The two cards differ in their cooling and length. The GTX 950 2GD5T OCV3 uses a dual-fan design and features a longer PCB, aimed at a typical desktop PC. The shorter GTX 950 2GD5T OCV2 uses a single fan and seems to be designed for mini-ITX systems. Thanks to the fact that the GM206 GPU features hardware-accelerated decoding and encoding of H.265 (HEVC) video and fully supports HDCP 2.2 content protection over HDMI 2.0, the OCV2 card could be a fine choice for small form-factor home-theater PCs.
    That being said, we have noticed on the stock photographs that the smaller model eliminates some of the PCI Express pins for an unknown reason, so the exact feature set is to be confirmed. We have put questions to MSI on this and will update when we have a response.
    Graphics cards based on GeForce GTX 950 GPUs with lowered power consumption can be used not only to build SFF HTPCs, but also to upgrade cheap desktop systems, which sometimes do not have a spare PCIe power connector inside. Previously, NVIDIA’s partners only offered the GeForce GTX 750 Ti without auxiliary power connectors, but that adapter is already two years old, and its performance may not be enough for modern titles.
    Now that there are (at least) two graphics card suppliers offering GeForce GTX 950 adapters with lowered power consumption, it is likely that other companies will follow as well. It is unclear whether NVIDIA officially sells GM206 GPUs with a low TDP to address the market of entry-level desktops, or whether companies like ASUS and MSI simply hand-pick GPUs that do not need more than 75 W of power to function properly. In any case, it is evident that such GPUs exist and are in enough demand that the AIB partners want to produce them.
    NVIDIA Video Card Specification Comparison
                          MSI GTX 950 OCV2/3   ASUS GTX950-2G    Ref GTX 950      Ref GTX 960     Ref GTX 750 Ti
    CUDA Cores            768                  768               768              1024            640
    Texture Units         48                   48                48               64              40
    ROPs                  32                   32                32               32              16
    Core Clock            1076 MHz             1026 MHz          1024 MHz         1126 MHz        1020 MHz
    Boost Clock           1253 MHz             1190 MHz          1188 MHz         1178 MHz        1085 MHz
    Memory Clock          6.6 Gbps GDDR5       6.6 Gbps GDDR5    6.6 Gbps GDDR5   7 Gbps GDDR5    5.4 Gbps GDDR5
    Memory Bus Width      128-bit              128-bit           128-bit          128-bit         128-bit
    VRAM                  2 GB                 2 GB              2 GB             2/4 GB          2 GB
    TDP                   75 W                 75 W              90 W             120 W           60 W
    Architecture          Maxwell 2            Maxwell 2         Maxwell 2        Maxwell 2       Maxwell 1
    GPU                   GM206                GM206             GM206            GM206           GM107
    Transistor Count      2.94 B               2.94 B            2.94 B           2.94 B          1.87 B
    Manufacturing Process TSMC 28nm            TSMC 28nm         TSMC 28nm        TSMC 28nm       TSMC 28nm
    Launch Date           03/16/16             03/16/16          08/20/15         01/22/15        02/18/14
    Launch Price          unknown              unknown           $159             $199            $149
    Exact prices and release dates for MSI’s GeForce GTX 950 2GD5T OCV3 and GeForce GTX 950 2GD5T OCV2 graphics cards are unknown. Keeping in mind that the boards are not exclusive products available only from MSI, their prices are unlikely to be very high, and should land near the GTX 950’s $159 MSRP.
    Gallery: MSI Introduces GeForce GTX 950 Graphics Cards with 75W Power Consumption




    More...

  8. #5828

    Anandtech: GDC 2016: Imagination Demonstrates PowerVR Vulkan SDK & PowerVR Ray Tracing

    Among the many companies showing off their latest development wares at GDC this week is Imagination. As opposed to their new GPU IP launches over the past couple of months – PowerVR Series7XT Plus and Series8XE – the focus for GDC is showing developers what they can do with their shipping hardware, and what new tools are ready for developers to use for the task.
    First off, following the recent release of the low-level Vulkan API 1.0 specification, Imagination has integrated support for the API into version 4.1 of their PowerVR Graphics SDK. As is the case with PC vendors, for mobile vendors Vulkan is a chance to shift towards a less CPU demanding and multi-threading friendly model for draw call submission, which is all the more important given the high prevalence of 4+ core CPUs on Android devices. To that end the company is also showing off its newest Vulkan tech demo, Sunset Vista, which uses Vulkan for both graphics and compute purposes, combining various graphical effects with a compute shader-driven foliage simulation.
    Meanwhile the company’s ray tracing team is continuing to show off the potential of their PowerVR Wizard technology now that the company has a working in-silicon implementation of the GR6500 ray tracing hardware. Imagination’s latest ray tracing tech demo in turn demonstrates ray tracing implemented over OpenGL ES using Imagination’s proprietary extensions. As with past demos the company is keen to show off hybrid rendering with classes of effects that are difficult or inefficient to implement via rasterization (i.e. regular GPUs) including specular reflections and efficient percentage closer soft shadows.
    Finally, along these lines, the company is also announcing a new collaboration with engine developer Unity this week. In 2014 the two announced that they were working together to implement real-time lightmap previews into the Unity editor. Now in 2016 that collaboration is taking a step forward; Unity and Imagination will be building a full-fledged software ray traced lightmap editor, with the goal of further improving the speed and quality at which lightmaps can be developed in real time. As this is a software implementation it isn't something Wizard GPUs can accelerate at this time, but it's easy to see how if everything were to go well for Wizard, Imagination and Unity could try to make that jump.

    More...

  9. #5829

    Anandtech: Razer Core Thunderbolt 3 eGFX Chassis: $499/$399, AMD & NVIDIA, Shipping In April

    Back at CES 2016, Razer announced their Core Thunderbolt 3 external graphics (eGFX) chassis. Built around the new Thunderbolt 3 standard and its long-awaited official support for external video cards, the Razer Core is the first eGFX chassis to hit the market. We got a bit more information about it with last week’s AMD XConnect announcement, and now today at GDC Razer is releasing the full details on functionality, compatibility, and availability.
    Jumping right into things, the Razer Core is what can be considered a full size eGFX chassis. The unit is large enough to accommodate a double-wide video card up to 12.2 inches long, which covers almost every video card on the market. The only notable exceptions here are cards that use external radiators (e.g. Radeon R9 Fury X) and the small number of ultra-high-end triple-wide card designs such as some of MSI’s Lightning cards. The toolless design is able to handle both open air and blower type video card coolers, however given its smaller size relative to a full PC case, I think it’s going to be worth looking into just how well open air cards do.
    Razer Core Thunderbolt 3 eGFX Chassis Specifications
    Max Video Card Size   Double-Wide, 12.2" Long (310 x 152 x 44mm)
    Max Video Card Power  375W
    Connectivity          4x USB 3.0, 1x Gigabit Ethernet, Laptop Charging via Thunderbolt 3
    Chassis Size          4.13 x 13.9 x 8.66 inches (105 x 353 x 220mm)
    Internal PSU          500W
    System Requirements   Thunderbolt 3 eGFX Certified PC, Thunderbolt 3 w/Active Cable, Windows 10
    Shipping Date         April 2016
    Price                 $499 ($399 w/Razer laptop)
    The chassis itself measures 4.13” x 13.9” x 8.66” and contains an internal 500W PSU, with Razer rating it to drive up to a 375W video card. At 10.89 lbs it is technically portable, though clearly not ideal for the task given its handle-less design. Rather, Razer intends this to be a fixed docking station for laptop users, as demonstrated both by the additional ports made available on the Core – 4x USB 3.0 Type-A and a Gigabit Ethernet port – and by the fact that it’s capable of charging laptops over its Thunderbolt 3 connection.
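One plausible way to read the 375 W ceiling, assuming standard PCIe connector ratings (Razer doesn't break the number down itself):

```python
# 375 W happens to equal a PCIe slot plus two 8-pin auxiliary connectors
# at their commonly cited rated limits. This is our reading, not Razer's
# stated breakdown.
SLOT_W = 75        # PCIe x16 slot
EIGHT_PIN_W = 150  # per 8-pin auxiliary connector

max_card_w = SLOT_W + 2 * EIGHT_PIN_W
print(max_card_w)  # 375
```

That budget comfortably covers every card on the compatibility list, with the 500W PSU leaving headroom for the USB ports and laptop charging.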
    The Core is the first of what we expect will be several TB3 eGFX chassis. As we briefly mentioned in our AMD XConnect overview, the Core is essentially the pathfinder product for the TB3 eGFX standard, with Intel, AMD, and Razer working together to put together the first devices and validate them. Consequently the Core is so far only validated to work with two laptops – Razer’s Blade Stealth and new 2016 Blade – however it should work with any future laptops that are also eGFX certified.
    As for video card compatibility, while the Core was initially developed with Intel and AMD, Razer has confirmed that the chassis does support both AMD and NVIDIA video cards. The full compatibility list is posted below, but for AMD cards it’s essentially all of their latest generation (290 series and newer) video cards. Meanwhile on the NVIDIA side all of the company’s Maxwell 1 and Maxwell 2 cards are supported, starting with the GTX 750 Ti. Though given the price of the Core, it goes without saying that the expectation is that it will be paired up with high-end video cards as opposed to lower-end models.
    Razer Core Video Card Compatibility List
    AMD: Radeon R9 Fury, Radeon R9 Nano, Radeon R9 300 Series, Radeon R9 290X, Radeon R9 290, Radeon R9 280
    NVIDIA: GeForce GTX Titan X, GeForce GTX 980 Ti, GeForce GTX 980, GeForce GTX 970, GeForce GTX 960, GeForce GTX 950, GeForce GTX 750 Ti
    While the Core supports both AMD and NVIDIA cards, the level of support currently differs between the two brands. As part of the eGFX development cycle, AMD’s drivers are fully validated for eGFX plug ‘n play operation, allowing Windows to gracefully handle losing the external GPU on both planned and accidental disconnects; in the accidental case Windows stays up, though applications using the GPU may crash. NVIDIA’s drivers, however, have not yet been validated for plug ‘n play operation, and it sounds like NVIDIA is still hammering out the final bugs. That said, NVIDIA has committed to having drivers ready by the time the Core ships in April, so we'll have to see where things stand next month.
    Finally, let’s talk about availability and pricing. Razer will begin taking pre-orders for the Core tonight through their website, with the chassis set to ship in April. As for pricing, the first eGFX chassis on the market will not come cheap: Razer has set the base price at $499, and after the cost of a high-end video card to put in the Core, we’re looking at a total price tag of $1,000 or more. However, Razer is offering a $100 discount on the Core if it’s purchased alongside a Razer Blade or Blade Stealth laptop – currently the only two eGFX-certified laptops – bringing the effective cost down to $399. Razer also notes that this offer is retroactive for customers who already purchased a Blade Stealth earlier this year, as the ultrabook and the Core were announced together at CES and the company doesn’t want to penalize early buyers who were planning to grab the Core anyhow.
    Gallery: Razer Core




    More...


    Anandtech: Intel's Skull Canyon NUC is Official

    Back in January, Intel provided us with information about the Skull Canyon NUC, based on a Skylake H-series CPU with Iris Pro graphics. Today, at GDC 2016, Intel made the specifications official, along with pricing and availability information.
    The key aspect that had not been revealed before was the dimensions. The Skull Canyon NUC (NUC6i7KYK) comes in at 211mm x 116mm x 28mm, for a volume of just 0.69L. For comparison, the Skylake NUC6i5SYK (the non-2.5" drive version) measures 115mm x 111mm x 32mm (0.41L), while the 2.5" drive bay-enabled NUC6i5SYH is 115mm x 111mm x 48mm (0.61L). The rest of the specifications are outlined in the table below:
    Intel NUC6i7KYK (Skull Canyon) Specifications
    Processor Intel Core i7-6770HQ
    Skylake, 4C/8T, 2.6 GHz (Turbo to 3.5 GHz), 14nm, 6MB L3, 45W TDP
    Memory 2x DDR4 SO-DIMM (2133+ MHz)
    Graphics Intel Iris Pro Graphics 580 (Skylake-H GT4e with 128MB eDRAM)
    Disk Drive(s) Dual M.2 (SATA3 / PCIe 3.0 x4 NVMe / AHCI SSDs)
    Networking Intel Dual Band Wireless-AC 8260 (2x2 802.11ac - 867 Mbps + Bluetooth 4.2)
    Intel I-219V Gigabit Ethernet
    Audio 3.5mm Audio Jack (Headphone / Microphone)
    Capable of 5.1/7.1 digital output with HD audio bitstreaming (HDMI)
    Miscellaneous I/O Ports 4x USB 3.0 (incl. one charging port)
    1x SDXC (UHS-I)
    1x HDMI 2.0, 1x mini-DP 1.2
    Consumer Infrared Sensor
    Operating System Barebones
    Pricing $650 (Barebones)
    $999 (Typical build with 16GB DDR4, 256GB SSD and Windows 10)
    Fact Sheet Intel NUC6i7KYK GDC Fact Sheet (PDF)
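    As a quick sanity check, the chassis volumes quoted in the comparison above follow directly from the listed dimensions. A minimal sketch (using the NUC6i5 figures from the article):

    ```python
    # Convert chassis dimensions in millimetres to litres
    # (1 L = 1,000,000 mm^3).
    def volume_litres(w_mm, d_mm, h_mm):
        return w_mm * d_mm * h_mm / 1_000_000

    nuc6i5syk = volume_litres(115, 111, 32)   # non-2.5" drive version
    nuc6i5syh = volume_litres(115, 111, 48)   # 2.5" drive bay-enabled version

    print(f"NUC6i5SYK: {nuc6i5syk:.2f} L")    # ~0.41 L
    print(f"NUC6i5SYH: {nuc6i5syh:.2f} L")    # ~0.61 L
    ```
    
    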
    Note that the HDMI 2.0 output is enabled by an external LSPcon (not Alpine Ridge). So, we will definitely have 4Kp60 output with HDCP 2.2 support over the HDMI port, making it suitable as a future-proof HTPC platform. From a gaming perspective, the availability of Thunderbolt 3 enables users to add an external graphics dock like the recently announced Razer Core eGFX module. Note that any external GPU will be able to talk to the CPU only over a PCIe 3.0 x4 link (which should be plenty in almost all cases).
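    To put the PCIe 3.0 x4 limitation in perspective, the link's effective bandwidth can be estimated from PCIe 3.0's 8 GT/s per-lane signaling rate and its 128b/130b line coding. A rough back-of-the-envelope sketch (protocol overheads beyond line coding are ignored):

    ```python
    # PCIe 3.0: 8 GT/s per lane, 128b/130b encoding.
    GT_PER_S = 8.0
    ENCODING = 128 / 130              # line-code efficiency

    def pcie3_gbytes_per_s(lanes):
        # Payload bandwidth in GB/s: transfers/s * efficiency * lanes / 8 bits-per-byte.
        return GT_PER_S * ENCODING * lanes / 8

    x4 = pcie3_gbytes_per_s(4)        # the link available to an eGFX dock
    x16 = pcie3_gbytes_per_s(16)      # a desktop slot, for comparison

    print(f"x4:  {x4:.2f} GB/s")      # ~3.94 GB/s
    print(f"x16: {x16:.2f} GB/s")     # ~15.75 GB/s
    ```

    So an external GPU gets roughly 3.9 GB/s each way, a quarter of a desktop x16 slot – a constraint that matters mostly for workloads that stream large amounts of data across the bus, rather than for typical gaming.
    
    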
    Gallery: Intel Skull Canyon NUC - Chassis Design & External I/O


    The Skull Canyon NUC will be available to pre-order on Newegg next month, with shipping in May 2016.


    More...
