Results 5,591 to 5,600 of 12095

Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #5591

    Anandtech: Samsung introduces Portable SSD T3

    In recent years a new category of portable storage devices has emerged. Based on the same controllers and flash used in 2.5" SATA SSDs, portable SSDs offer much higher performance and capacities than those typical of USB thumb drives. The use of SATA to USB3 bridge chips allows portable SSDs to be used with almost any device, as opposed to relying on the rare eSATA standard. Most portable SSDs also support the USB Attached SCSI Protocol (UASP) to cut down on the overhead relative to a direct SATA connection. Portable SSDs usually can't match the performance of their SATA counterparts, but they are closer to native SATA performance than to normal thumb drive speeds.
    At last year's CES, Samsung introduced their Portable SSD T1, their first foray into this market. This year they've got a successor, the Portable SSD T3. Externally, the biggest difference is that the T3 switches to a metal case from the black plastic of the T1. This doubles the overall weight, bringing it up to 51 grams. The T3 also adopts the reversible Type C USB port, replacing the T1's micro USB 3 Type B port. The T3 includes a Type C to Type A cable.
    We don't have much information on what's changed internally. The T1 used the same controller and TLC 3D NAND as the 850 EVO. After the launch of the T1, the 850 EVO and Pro product lines gained 2TB models thanks to Samsung's newer MHX controller, which expanded the amount of RAM that could be accessed and allowed the drives to manage twice as much flash. The Portable SSD T3 introduces a 2TB option so we're pretty sure it is also adopting the MHX controller for at least that capacity. Like the 850 EVO and Pro, the smaller capacities may be using the earlier MEX and MGX controllers, but that shouldn't hinder their performance.
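    The link between addressable RAM and manageable flash follows from how DRAM-based SSD controllers hold their page-mapping table in memory. As a quick sketch, a common rule of thumb for a fully page-mapped flash translation layer is roughly 1GB of DRAM per 1TB of NAND (one 4-byte entry per 4KiB page); this is a generic approximation, not Samsung's disclosed design:

```python
# Rule-of-thumb flash translation layer (FTL) mapping-table sizing:
# one 4-byte entry per 4 KiB NAND page => ~1 GiB of DRAM per 1 TiB of NAND.
def ftl_dram_bytes(nand_bytes, page_bytes=4096, entry_bytes=4):
    """Approximate DRAM needed for a fully page-mapped FTL."""
    return (nand_bytes // page_bytes) * entry_bytes

TIB, GIB = 1024**4, 1024**3
for capacity in (1 * TIB, 2 * TIB):
    print(f"{capacity // TIB} TiB NAND -> ~{ftl_dram_bytes(capacity) // GIB} GiB DRAM")
```

    Under this approximation, doubling the flash a controller can manage requires doubling its addressable DRAM, which is consistent with the MHX's role in enabling the 2TB models.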
    The Portable SSD T3 will be available in early March. Pricing has not been announced.


    More...

  2. RSS Bot FEED's Avatar
    #5592

    Anandtech: ASUS Announces February Launch For The ZenFone Zoom

    Today at CES ASUS showcased their upcoming ZenFone Zoom, and confirmed that it will be coming to the United States this February. ASUS has teased the ZenFone Zoom a couple of times, and most notably showed it at IFA last year. It was also mentioned during their CES keynote last year, and so it has taken quite some time for it to get to market. Just as a refresher, you can find the ZenFone Zoom's specs in the table below.
    ZenFone Zoom
    SoC: Intel Atom Z3580/Z3590, quad core @ 2.3/2.5GHz, PowerVR G6430 GPU
    RAM: 4GB LPDDR3
    NAND: 64/128GB NAND
    Display: 5.5” 1080p IPS
    Network: 2G / 3G / 4G LTE (Category 4 LTE)
    Dimensions: 158.9 x 78.84 x 5-11.9mm, 185g
    Camera: 13MP rear facing, 3x optical zoom, 28-84mm, F/2.7-F/4.8, OIS; 5MP front facing
    Battery: 3000 mAh (11.4 Whr)
    OS: Android 5.1 w/ ZenUI
    Connectivity: 802.11a/b/g/n/ac + BT 4.0, USB 2.0, GPS/GNSS, NFC
    SIM: 1 x MicroSIM
    Launch Price: $399 (Z3580/64GB)
    The ZenFone Zoom is very similar to the ZenFone 2 as far as the internal specs go. In the 128GB model there is a bump in clock speed, as the SoC moves from Atom Z3580 to Z3590, but the phone remains the same otherwise. Obviously the big attraction is the rear-facing camera with optical zoom. It's not clear exactly which camera sensor is used in the ZenFone Zoom, but I wouldn't be surprised to find that it's the same Toshiba sensor from the ZenFone 2. I hope that ASUS has put a lot of work into improving their camera processing from the state it was in on the ZenFone 2, as hardware has never really been the issue with photo quality on ASUS devices.
    The ZenFone Zoom will be launching in the United States this February, with a starting price of $399 for the 64GB model. Pricing for the 128GB model with the Z3590 SoC is currently unknown.
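    The spec table quotes the battery as both 3000 mAh and 11.4 Whr; the two figures are consistent with a 3.8 V nominal cell voltage, a common value for lithium-polymer packs (the voltage is inferred here, not stated by ASUS):

```python
# Wh = Ah * V: check the spec table's two battery figures against an
# assumed 3.8 V nominal cell voltage (not quoted by ASUS).
def mah_to_wh(mah, nominal_volts=3.8):
    return mah / 1000 * nominal_volts

print(f"{mah_to_wh(3000):.1f} Wh")  # ZenFone Zoom: 11.4 Wh, matching the table
```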


    More...

  3. RSS Bot FEED's Avatar
    #5593

    Anandtech: Lenovo Announces The Vibe S1 Lite

    Today Lenovo introduced the Vibe S1 Lite at CES in Las Vegas. The Vibe S1 Lite is a mid-range smartphone with some notable specs relative to its price. The known specifications for Lenovo's newest phone are in the chart below, with a few bits of information, such as the dimensions and WiFi/Bluetooth specifications, still unknown at the moment.
    Lenovo Vibe S1 Lite
    SoC: MediaTek MTK 6753, octa core Cortex A53 @ 1.3GHz
    RAM: 2GB LPDDR3
    NAND: 16GB NAND + MicroSD
    Display: 5” 1080p IPS
    Network: 2G / 3G / 4G LTE
    Camera: 13MP rear facing, PDAF; 8MP front facing
    Battery: 2700 mAh (10.26 Whr)
    OS: Android 5.1
    SIM: Dual NanoSIM
    Launch Price: $199
    The Lenovo Vibe S1 Lite won't be coming to the US, which is a shame because it looks like an interesting phone for the price. For $199 you get a 5" 1920x1080 IPS display, support for LTE, and a 13MP rear-facing Sony camera with support for PDAF. Interestingly enough, Lenovo notes that it's an ISOCELL sensor, even though ISOCELL is really a Samsung-specific term for deep trench isolation, the process of putting barriers between the pixels on a camera sensor to reduce crosstalk. All of this is run off of a 10.26Wh battery.
    If there's one thing that isn't too exciting about the Vibe S1 Lite, it's probably the SoC. The MTK 6753 is an octa-core Cortex A53 part with a max frequency of 1.3GHz. That puts it on the same playing field as the Snapdragon 410, which is definitely not bad for the price, but some of the other specs are a bit more exciting at this price point.
    The last notable thing about the Vibe S1 Lite is the design. Based on the press photos the phone actually looks quite nice for a phone at this price. The color choices are quite interesting, and when they're combined with the metal band around the phone the design reminds me a bit of the Nexus 6. That's somewhat intriguing when you consider that Lenovo now owns Motorola Mobility.
    The Lenovo Vibe S1 Lite will be available in white and blue in the first quarter of this year, for a price of $199 USD. Like I said earlier, it actually will not be sold in the US despite its introduction at CES, but it will be sold in all the markets that Lenovo phones are currently sold in.


    More...

  4. RSS Bot FEED's Avatar
    #5594

    Anandtech: Acer Aspire Unveils Switch 12 S 2-in-1 Notebook

    Acer on Monday introduced its new 2-in-1 hybrid notebook at the Consumer Electronics Show. The Aspire Switch 12 S is designed for those who need a decent level of performance and features, along with a high-resolution display, in a sleek form factor. The new 2-in-1 system features Intel Thunderbolt 3 technology and will be compatible with Acer’s upcoming Graphics Dock, an external graphics solution for mobile PCs. The new hybrid personal computers will hit the market next month.
    The Acer Aspire Switch 12 S hybrid PC is based on an Intel Core M central processing unit with a 4.5W thermal design power and the Skylake micro-architecture. The 2-in-1 comes equipped with 4GB or 8GB of dynamic random access memory (DRAM), a 128 GB or 256 GB solid-state drive, an 802.11 b/g/n/ac Wi-Fi controller with 2x2 MU-MIMO technology, a 720p front-facing webcam, an Intel RealSense R200 camera for 3D scanning, two USB 3.0 ports, one Thunderbolt 3/USB 3.1 Type-C port, as well as a micro-SD card reader. The system is completely fanless thanks to the very low TDP of its CPU.
    The 12.5-inch display panel of the Aspire Switch 12 S uses IPS technology along with Corning Gorilla Glass 4 for protection, and it features either a 1920 × 1080 or 3840 × 2160 resolution, depending on the exact configuration. The multi-touch display supports the Acer Active Pen for note-taking and sketching, something that may be useful for business users and creative professionals.
    Those who would like to use the Aspire Switch 12 S for gaming will eventually be able to connect an optional Acer Graphics Dock to the Thunderbolt 3 port. At present, Acer is not revealing anything about the upcoming Graphics Dock. Considering that Intel’s Core M processors can hardly provide enough horsepower for demanding games, even a mainstream discrete GPU inside the dock would significantly improve the gaming capabilities of the laptop. Nonetheless, it is hard to expect the Graphics Dock to transform a low-power 2-in-1 machine into a gaming powerhouse.
    The Acer Aspire Switch 12 S is made of anodized aluminum. The tablet part of the device is about 7.85 mm (0.31 inches) thick and weighs around 800 grams (1.76 pounds). With keyboard dock connected, the 2-in-1 laptop is 17.3 mm thick (0.68 inches) and weighs around 1400 grams (3.09 pounds).
    Acer officially positions the Aspire Switch 12 S for all types of users, including business road warriors, creative professionals and mainstream users. The 4K display, stylus support, Intel Thunderbolt 3 technology and RealSense camera for 3D scanning make the new 2-in-1 PC considerably more advanced than previous-generation hybrids. In fact, the combination of a 4K display and Thunderbolt 3 support makes the Aspire 2-in-1 nearly unique, as such a pairing is rare in general. The Acer Graphics Dock will make the Aspire Switch 12 S somewhat more attractive to gamers. However, the lack of a 4G/LTE module, a fingerprint reader and high-capacity storage options may reduce the popularity of the Aspire Switch 12 S among business and professional users.
    The Acer Aspire Switch 12 S will be available in North America in February starting at $999.99. The new 2-in-1 will also hit the markets of Europe, the Middle East and Africa in February with prices starting from €1,199.
    Gallery: Acer Aspire Unveils Switch 12 S 2-in-1 Notebook




    More...

  5. RSS Bot FEED's Avatar
    #5595

    Anandtech: MAINGEAR Rolls-Out 34” All-in-One PC with 18-Core Xeon, GeForce GTX Titan

    The concept of all-in-one desktop personal computers was invented to save space and simplify the design of PCs. While there are a lot of traditional AIO desktops available today, leading PC makers began to address performance-demanding market segments with specially designed models several years ago. At the Consumer Electronics Show on Monday, boutique PC maker MAINGEAR introduced the world’s first AIO desktop featuring top-of-the-range gaming or even server components.
    The MAINGEAR Alpha 34 is a giant all-in-one desktop with a 34” curved display with 3840×1440 resolution. Unlike the vast majority of AIO PCs, the Alpha 34 is powered by industry-standard mini-ITX motherboards — the ASUS ROG Maximus VIII Impact or the ASRock X99E-ITX for high-end configurations (an Intel H110-based motherboard is available as an option for lower-cost configurations). The system can use various microprocessors in LGA1151 or LGA2011 form factors, including Intel Core i3/i5/i7 or Intel Xeon chips with up to 18 cores and up to 45MB of cache. The AIO desktop uses the company’s own closed-loop liquid cooler to ensure the stability of desktop and server microprocessors.
    The Alpha 34 all-in-one system from MAINGEAR can be equipped with up to 32GB of unbuffered DDR4 memory, one M.2 NVMe solid-state drive and up to two 2.5” storage devices. The AIO PC can accommodate full-sized desktop graphics cards, including the AMD Radeon R9 Nano, the NVIDIA GeForce GTX Titan X, or professional accelerators. The system naturally supports all the connectivity options provided by the aforementioned motherboards, including Gigabit Ethernet, 802.11 a/b/g/n/ac Wi-Fi, 5.1-channel audio, USB 3.0, USB 3.1 connectors and so on.
    MAINGEAR offers clients the option to customize their Alpha 34 desktops in a bid to get the configuration they would like. However, the MAINGEAR Alpha 34 is equipped with a 450W power supply unit, and therefore not all setups are feasible: multi-core Intel Xeon processors as well as top-of-the-range graphics cards consume a lot of power, and 450W may not be enough to feed every possible configuration.
    Performance of the MAINGEAR Alpha 34 featuring the latest Core i7 processors should be on par with that of high-end tower desktops. Upgradeability of all-in-one systems is not as flexible as that of tower machines, which is one of the reasons why AIOs are not for everyone. To make the Intel Z170-based systems a little more future-proof, the PC maker offers factory overclocking for the Skylake-S microprocessors inside the Alpha 34.
    All MAINGEAR systems — including the Alpha 34 — can be custom painted and equipped with various peripherals like external optical drives, keyboards, mice, headsets and so on.
    Pricing of the MAINGEAR Alpha 34 starts at $1,999. A fully-fledged Alpha 34 gaming AIO PC with premium components, but without custom-finish and peripherals will cost $6,150.99. A workstation machine inside the Alpha 34 chassis will be priced at around $15,000. MAINGEAR will start to ship its Alpha 34 systems starting February 1, 2016.


    More...

  6. RSS Bot FEED's Avatar
    #5596

    Anandtech: NVIDIA Discloses Next-Generation Tegra SoC; Parker Inbound?

    While NVIDIA has been rather quiet about the SoC portion of the DRIVE PX 2, it’s unmistakable that a new iteration of the Tegra SoC is present.
    The GPUs and SoCs of the DRIVE PX 2 are fabricated on TSMC’s 16nm FinFET process, which is something that we haven’t seen yet from NVIDIA. The other obvious difference is the CPU configuration. While Tegra X1 had four Cortex A57s and four Cortex A53s, this new SoC (Tegra P1?) has four Cortex A57s and two Denver CPUs. As of now it isn’t clear whether this is the same iteration of the Denver architecture that we saw in the Tegra K1. However, regardless of which iteration it is, we’re still looking at a CPU design that pairs an in-order ARM decode path with a wide VLIW core, relying on dynamic code optimization to reorder and translate ARM instructions into the VLIW core’s native ISA.
    Based on the description of the SoC, while NVIDIA is not formally announcing this new SoC or giving it a name at this time, the feature set lines up fairly well with the original plans for the SoC known as Parker. Before it was bumped to make room for Tegra X1, it had been revealed that Parker would be NVIDIA's first 16nm FinFET SoC, and would contain Denver CPU cores, just like this new SoC.

    NVIDIA's Original 2013 Tegra Roadmap, The Last Sighting of Parker
    Of course Parker was also said to include a Maxwell GPU, whereas NVIDIA has confirmed that this new Tegra is Pascal based. Though with Parker's apparent delay, an upgrade to Pascal makes some sense here. Otherwise we have limited information on the GPU at present besides its Pascal heritage; NVIDIA is not disclosing anything about the number of CUDA cores or other features.
    NVIDIA Tegra Specification Comparison (Tegra X1 vs. 2016 "Parker")
    CPU Cores: 4x ARM Cortex A57 + 4x ARM Cortex A53 vs. 2x NVIDIA Denver + 4x ARM Cortex A57
    CUDA Cores: 256 vs. ?
    Memory Clock: 1600MHz (LPDDR4) vs. ?
    Memory Bus Width: 64-bit vs. ?
    FP16 Peak: 1024 GFLOPS vs. ?
    FP32 Peak: 512 GFLOPS vs. ?
    GPU Architecture: Maxwell vs. Pascal
    Manufacturing Process: TSMC 20nm SoC vs. TSMC 16nm FinFET
    But for now the bigger story is the new Tegra's CPU configuration. Needless to say, this is at least somewhat of an oddball architecture. As Denver is a custom CPU core, we’re looking at a custom interconnect by NVIDIA to make the Cortex A57 and Denver cores work together. The question then is why NVIDIA would want to pair up Denver CPU cores with the also relatively high-performing Cortex A57 cores.
    At least part of the answer will depend on whether NVIDIA’s software stack uses the two clusters in a cluster migration scheme or some kind of HMP scheme. Comments made by NVIDIA during their press conference indicate that they believe the Denver cores on the new Tegra will offer better single-threaded performance than the A57s. Without knowing more about the version of Denver in the new Tegra, this is somewhat surprising, as it’s pretty much public knowledge that Denver has had issues dealing with code that doesn’t resemble a non-branching loop, and, more troublesome yet, code generation for Denver can take up a pretty significant amount of time. As we saw with the Denver TK1, Cortex A57s can actually be faster clock-for-clock if the code is particularly unfavorable to Denver.
    Consequently, if NVIDIA is using a traditional cluster migration or HMP scheme where Denver is treated as a consistently faster core in all scenarios, I would be at least slightly concerned if NVIDIA decided to ship this configuration with the same iteration of Denver as in the Tegra K1. Though equally likely, NVIDIA has had over a year to refine Denver and may be rolling out an updated (and presumably faster) version for the new Tegra. Otherwise it also wouldn’t surprise me if the vast majority of CPU work for PX 2 is run on the A57 cluster while the Denver cluster is treated as a co-processor of sorts, in which only specific cases can even access the Denver CPUs.


    More...

  7. RSS Bot FEED's Avatar
    #5597

    Anandtech: NVIDIA Announces DRIVE PX 2 - Pascal Power For Self-Driving Cars

    As has become tradition at CES, the first major press conference of the show belongs to NVIDIA. In previous years their press conference would be dedicated to consumer mobile parts – the Tegra division, in other words – while more recently the company’s conference has shifted to a mix of mobile and automobiles. Finally for 2016, NVIDIA has made a full transition over to cars, with this year’s press conference focusing solely on the subject and skipping consumer mobile entirely.
    At CES 2015 NVIDIA announced the DRIVE CX and DRIVE PX systems, with DRIVE CX focusing on cockpit visualization while DRIVE PX was part of a much more ambitious entry into the self-driving vehicle market for NVIDIA. Both systems were based around NVIDIA’s then-new Tegra X1 SoC, implementing it either for its graphics capabilities or its compute capabilities respectively.
    For 2016 however, NVIDIA has doubled-down on self-driving vehicles, dedicating the entire press conference to the concept and filling the conference with suitable product announcements. The headline announcement for this year’s conference then is the successor to NVIDIA’s DRIVE PX system, the aptly named DRIVE PX 2.
    From a hardware perspective the DRIVE PX 2 essentially picks up where the original DRIVE PX left off. NVIDIA continues to believe that the solution to self-driving cars is computer vision realized through neural networks, with more compute power being necessary to get better performance with greater accuracy. To that end, while DRIVE PX was something of an early system to prove the concept, with DRIVE PX 2 NVIDIA is thinking much bigger.
    NVIDIA DRIVE PX Specification Comparison (DRIVE PX vs. DRIVE PX 2)
    SoCs: 2x Tegra X1 vs. 2x Tegra "Parker"
    Discrete GPUs: N/A vs. 2x unknown Pascal
    CPU Cores: 8x ARM Cortex-A57 + 8x ARM Cortex-A53 vs. 4x NVIDIA Denver + 8x ARM Cortex-A57
    GPU Cores: 2x Tegra X1 (Maxwell) vs. 2x Tegra "Parker" (Pascal) + 2x unknown Pascal
    FP32 TFLOPS: >1 TFLOPS vs. 8 TFLOPS
    FP16 TFLOPS: >2 TFLOPS vs. 16 TFLOPS?
    TDP: N/A vs. 250W
    As a result the DRIVE PX 2 is a very powerful – and very power hungry – design meant to offer much greater compute performance than the original DRIVE PX. Based around NVIDIA’s newly disclosed 2016 Tegra (likely to be Parker), the PX 2 incorporates a pair of the SoCs. However in a significant departure from the original PX, the PX 2 also integrates a pair of Pascal discrete GPUs on MXM cards, in order to significantly boost the GPU compute capabilities over what a pair of Tegra processors alone could offer. The end result is that PX 2 packs a total of 4 processors on a single board, essentially combining the two Tegras’ 8 ARM Cortex-A57 and 4 NVIDIA Denver CPU cores with 4 Pascal GPUs.
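    The processor tally above follows directly from doubling the per-SoC configuration and adding the discrete GPUs. As a quick sanity check (counts taken from the article; the dictionary labels are just illustrative):

```python
# Tally DRIVE PX 2 processors: two "Parker" SoCs plus two discrete Pascal GPUs.
per_soc = {"Denver": 2, "Cortex-A57": 4, "integrated Pascal GPU": 1}
soc_count, discrete_gpus = 2, 2

totals = {name: n * soc_count for name, n in per_soc.items()}
totals["discrete Pascal GPU"] = discrete_gpus

print(totals)
# 4x Denver + 8x Cortex-A57 CPU cores; 2 integrated + 2 discrete = 4 Pascal GPUs
```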
    NVIDIA is not disclosing anything about the discrete Pascal GPUs at this time beyond the architecture and that, like the new Tegra, they’re built on TSMC’s 16nm FinFET process. However looking at the board held up by NVIDIA CEO Jen-Hsun Huang, it appears to be a sizable card with 8 GDDR5 memory packages on the front. My gut instinct is that this may be the Pascal successor to GM206 with the 8 chips forming a 128-bit memory bus in clamshell mode, but at this point that’s speculation on my part.
    What isn’t in doubt though are the power requirements for PX 2. PX 2 will consume 250W of power – equivalent to today’s GTX 980 Ti and GTX Titan X cards – and will require liquid cooling. NVIDIA’s justification for the design, besides the fact that this much computing power is necessary, is that a liquid cooling system ensures that the PX 2 will receive sufficient cooling in all environmental conditions. More practically though, the company is targeting electric vehicles with this, many of which already use liquid cooling, and as a result are a more natural fit for PX 2’s needs. For all other vehicles the company will also be offering a radiator module to use with the PX 2.
    Otherwise NVIDIA never did disclose the requirements for the original PX, but it’s safe to say that PX 2 is significantly higher. It’s particularly telling that in the official photos of the board with the liquid cooling loops installed, it’s the dGPUs we clearly see attached to the loops. Consequently I wouldn’t be surprised if the bulk of that 250W power consumption comes from the dGPUs rather than the Tegra SoCs.
    As far as performance goes, NVIDIA spent much of the evening comparing the PX 2 to the GeForce GTX Titan X, and for good reason. The PX 2 is rated for 8 TFLOPS of FP32 performance, which puts it 1 TFLOPS ahead of the 7 TFLOPS Titan X. However, while those are the raw specifications, it’s important to note that the Titan X is a single GPU whereas the PX 2 spreads its performance across 4, which means the PX 2 will need to work around multi-GPU scaling issues that aren’t a problem for the Titan X.
    Curiously, NVIDIA also used the event to introduce a new unit of measurement – the Deep Learning Tera-Op, or DL TOPS – which at 24 is an unusual 3x higher than the PX 2’s FP32 performance. Based on everything disclosed by NVIDIA about Pascal so far, we don’t have any reason to believe FP16 performance is more than 2x Pascal’s FP32 performance, so where the extra performance comes from is a mystery at the moment. NVIDIA quoted this figure rather than FP16 FLOPS, so it may include a special-case operation (à la the fused multiply-add), or even the performance of the Denver CPU cores.
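    The mismatch is plain arithmetic: 24 DL TOPS is 3x the quoted 8 TFLOPS FP32 rating, while a conventional 2:1 FP16-to-FP32 ratio would predict only 16. A sketch of that gap:

```python
# Compare NVIDIA's quoted 24 DL TOPS against what a standard
# 2x FP16:FP32 ratio would predict for an 8 TFLOPS FP32 part.
fp32_tflops = 8.0
dl_tops = 24.0

ratio = dl_tops / fp32_tflops       # 3.0x FP32 - the unusual multiplier
expected_fp16 = 2 * fp32_tflops     # 16.0 TFLOPS at a conventional 2:1 ratio

print(ratio, expected_fp16)
# The remaining 8 TOPS is unexplained by FP16 throughput alone.
```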
    On that note, while DRIVE PX 2 was the focus of NVIDIA’s presentation, it was GTX Titan X that was actually driving all of the real-time presentations. As far as I know we did not actually see any demos being powered by PX 2, and it’s unclear whether PX 2 is even ready for controlled demonstration at this time. NVIDIA mentions in their press release that the PX 2 will be available to early access partners in Q2, with general availability not occurring until Q4.
    Meanwhile along with the PX 2 hardware, NVIDIA also used their conference to reiterate their plans for self-driving cars, and where their hardware and software will fit into this. NVIDIA is still aiming to develop a hardware ecosystem for the automotive industry rather than an end-to-end solution. Which is to say that they want to provide the hardware, while letting their customers develop the software.
    However at the same time, in an acknowledgement that it’s not always easy for customers to start from scratch, NVIDIA will also be developing their own complete reference platform combining hardware and software. The reference platform includes not just the hardware for self-driving cars – including the PX 2 system and other NVIDIA hardware to train the neural nets – but also software components including the company’s existing DriveWorks SDK, and a pre-trained driving neural net the company is calling DRIVENet.
    Consequently while the company isn’t strictly in the process of developing its own cars, it is essentially in the process of training them. Which means NVIDIA has been sending cars around the Sunnyvale area to record interactions, training the 37 million neuron network how to understand traffic. A significant portion of NVIDIA’s presentation was taken up demonstrating DRIVENet in action, showcasing how well it understood the world using a combination of LIDAR and computer vision, with a GTX Titan X running the network at about 50fps. Ultimately I think it’s fair to say that NVIDIA would rather their customers be doing this, building nets on top of systems like DIGITS, but they also have seen first-hand in previous endeavors that bootstrapping an ecosystem like they desire requires having all of the components already there.
    Finally, NVIDIA also announced that they have lined up their first customer for PX 2: Volvo. In 2017 the company will be outfitting 100 XC90 SUVs with the PX 2, for use in their ongoing self-driving car development efforts.


    More...

  8. RSS Bot FEED's Avatar
    #5598

    Anandtech: HTC Unveils the Vive Pre Dev Kit

    Today HTC has taken the wraps off of the second-generation version of the HTC Vive. As you probably know, the HTC Vive is a virtual reality head-mounted display designed and made jointly by HTC and Valve. The Vive's consumer launch date has been pushed back a couple of times now, but certain developers have had access to developer versions of the headset for some time in order to develop new titles for it or work on adapting existing ones. The new Vive Pre is the second version of the Vive developer kit, and it comes with a number of improvements that bring the Vive closer to its eventual commercial launch, which will be occurring this year.
    The Vive Pre makes some notable additions to the earlier version. First and foremost are the improvements to ergonomics. According to HTC, the headset has basically been redesigned from the ground up to be more compact and fit more comfortably onto your head while also being more stable. The displays have been made brighter and refinements to the entire display and lens stack have improved clarity over the existing model. Finally, there has been a front camera added to the headset. This may seem strange at first, but what the camera allows for is augmented reality experiences where a feed of the real world can be shown to the user and illusions can be projected onto that space by the headset.

    As for the controllers, the design has been overhauled to make them more ergonomic. The buttons have been textured to make them easier to find, and the trigger has been changed to a dual stage switch which allows for interactions with multiple states, such as holding or squeezing something. There's also haptic feedback to go along with interactions, and this is something that can really help the experience when implemented in a proper and subtle manner. Finally, the tracking stations for the controllers have been made smaller and more precise.
    I had a chance to try the new Vive Pre earlier, and it marked my first experience with a virtual reality headset, with the exception of the Nintendo Virtual Boy. While I can't make any statements that compare the new Vive to the old dev kit or to other VR headsets like the Oculus Rift, I can say that the experience with the headset and the controllers was unlike anything I've experienced before. The demo consisted of a virtual environment that simulated some of the challenges one would encounter when climbing Mount Everest. It included very theatrical sweeping shots where you looked over the mountains as though you were flying in the air or riding on a helicopter, as well as interactive segments that simulated crossing over a large pit, and climbing up a ladder.
    What amazed me was how quickly I forgot about the fact that I was just in a hotel room wearing a rather large helmet and holding some controllers, and I found myself too frightened to look right over the edge of a cliff, and felt a strange feeling when I climbed the ladder as though I was nervous with my increasing height, even though I knew very well that I was standing on the floor the entire time. Head tracking latency was also very low, and to be honest the only thing that ever took me out of the experience was the limited resolution of the displays. That's a technology issue that will be improved with time, but even with that barrier to total immersion the experience is still extremely compelling and unlike anything else.
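    Low head-tracking latency ultimately comes down to a tight frame-time budget. Assuming a 90 Hz panel, the refresh rate generally used by this class of headset (not stated in the article), the renderer gets roughly 11 ms per frame, one component of the overall motion-to-photon latency:

```python
# Frame-time budget for a VR headset at an assumed 90 Hz refresh rate:
# each frame must be rendered and scanned out in about 11.1 ms.
refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz
print(f"{frame_budget_ms:.1f} ms per frame")
```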
    As of right now, the HTC Vive is scheduled to launch commercially in April of this year. Whether or not that date will be pushed back again is unknown, but what I can say is that I think the Vive and other VR headsets will have been worth the wait.


    More...

  9. RSS Bot FEED's Avatar
    #5599

    Anandtech: Seagate Updates DAS Portfolio at CES 2016

    Seagate has announced four new DAS (direct attached storage) products at CES 2016. Three of them target the premium / luxury market under the LaCie brand name.

    • Seagate Backup Plus Ultra Slim USB 3.0 bus-powered external hard drive
    • LaCie Porsche Design USB 3.0 Type-C bus-powered external hard drive (mobile model)
    • LaCie Porsche Design USB 3.0 Type-C external hard drive (desktop model) with power delivery
    • LaCie Chrome USB 3.1 Type-C external SSD

    The LaCie Chrome USB 3.1 Type-C external SSD is easily the most impressive announcement of the four.
    Obviously, one of the key points of the LaCie products is the striking industrial design, and the Chrome is no exception.
    The product contains two 512GB M.2 SATA SSDs in RAID-0 (effective user capacity is 1TB). It can support data rates of up to 940 MBps, thanks to the integrated ASMedia ASM1352R dual SATA to USB 3.1 Gen 2 bridge chip.
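    As a back-of-the-envelope check on those numbers (a sketch, not LaCie's actual firmware behavior): striping two drives in RAID-0 doubles capacity and, ideally, sequential throughput, which is then capped by the USB 3.1 Gen 2 link. The per-drive speed below is an assumed typical SATA figure:

```python
# RAID-0 across two M.2 SATA SSDs behind a USB 3.1 Gen 2 bridge:
# capacity adds, and ideal sequential speed is the lesser of the summed
# SATA throughput and the USB link's practical ceiling (both approximate).
drives = 2
capacity_gb = 512          # per drive
sata_mbps = 500            # assumed per-drive sequential MB/s
usb31_gen2_mbps = 1000     # ~10 Gb/s minus encoding/protocol overhead

total_capacity_gb = drives * capacity_gb
ideal_speed_mbps = min(drives * sata_mbps, usb31_gen2_mbps)
print(total_capacity_gb, "GB at up to", ideal_speed_mbps, "MB/s")
```

    The resulting ceiling of roughly 1000 MB/s lines up with the 940 MBps figure Seagate quotes.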
    Seagate touts the aluminium enclosure, efficient triple cooling system, magnetized cable management (it is similar to the 2big Thunderbolt 2 product in this respect) and a removable magnetized display stand as unique features for this product.
    Gallery: LaCie Chrome USB 3.1 External SSD


    It must be noted that the Chrome does need an external power connector (understandable due to the need to power two M.2 SSDs). The above gallery shows us the various external aspects of the Chrome unit.
    The unit will retail for $1100 and be available later this quarter.
    The LaCie Porsche Design USB 3.0 Type-C external hard drives have a new industrial design for the aluminium enclosure and come with a Type-C connector. Other than that, there is nothing too striking about them. The desktop model needs external power, but it also does power delivery over its Type-C port (making it ideal for devices like the MacBook). Both the mobile and desktop versions come with a USB Type-A to USB Type-C cable in addition to the Type-C to Type-C cable, which enables compatibility with a wider variety of systems.
    The Mobile version comes in 1TB, 2TB and 4TB capacities, starting at $110. The Desktop Drive comes in 4TB, 5TB and 8TB capacities, starting at $210.
    Rounding out the product launches is the Seagate Backup Plus Ultra Slim. It is a 2.5" external hard drive, and its firmware features are similar to those of the Seagate Backup Plus we reviewed last August. This implies integration of the Seagate Dashboard software, which provides more features than a standard external hard drive. The device also comes with 200GB of OneDrive cloud storage valid for two years, and it is compatible with the Lyve photo management software.
    The technically interesting aspects include the 9.6mm thickness (Seagate indicated that it is currently the thinnest external hard drive in its capacity class). It comes in 1TB and 2TB capacities with a two-platter design. Cross-platform compatibility is enabled by a free Paragon driver download (enabling Macs to read drives formatted in NTFS and Windows PCs to read drives formatted in HFS+).
    We don't have pricing details for the Backup Plus Ultra Slim yet, but availability is slated for later this quarter.



    Anandtech: Toshiba’s DynaPad Tablet to Hit Stores in Late January

    Toshiba showcased its ultra-thin dynaPad tablet in September 2015 at IFA in Berlin, Germany, and then formally introduced it in mid-October. At the International CES 2016, the company finally revealed that the dynaPad will hit the U.S. market later this month. Toshiba says that its new 12-inch tablet is among the thinnest Windows 10-based devices of its kind.
    The Toshiba dynaPad tablet features a 12-inch display with 1920×1280 resolution, which is covered with Corning’s Gorilla Glass 3 as well as a special anti-fingerprint coating. The device is equipped with Toshiba’s active electrostatics (ES) stylus with Wacom Feel technology, which supports 2048 levels of pressure sensitivity. The digitizer pen can last for more than 1,000 hours on one charge and can be used for note taking, sketching and drawing. In addition, Toshiba offers a special keyboard dock for the dynaPad, which converts the slate into a laptop.
    The dynaPad tablet from Toshiba runs Microsoft’s Windows 10 operating system and is based on the Intel Atom x5-Z8300 system-on-chip (four cores, 2MB cache, 1.44 GHz – 1.84 GHz clock rate, built-in Intel HD Graphics core with 12 execution units, 2 W thermal design power, 14 nm process technology). The SoC of the dynaPad is similar to the one used by Microsoft’s Surface 3, but it runs at a lower frequency and thus delivers lower performance.
    Toshiba’s dynaPad also comes with up to 4 GB of DDR3L RAM, up to 64 GB of NAND flash storage, Wi-Fi (802.11ac) and Bluetooth 4.0 wireless technologies, a 2 MP front-facing camera, an 8 MP rear-facing camera, various sensors and so on. The dynaPad sports two micro USB 2.0 ports, a microSD card slot and a micro HDMI port for connecting external displays. Toshiba has yet to reveal the precise specifications and configurations of the dynaPad.
    The new tablet from Toshiba weighs 580 grams (1.28 pounds) and measures about 6.9 mm (0.27 inches) thick. When the keyboard is attached, the weight increases to around 1,000 grams (2.2 pounds). Toshiba has not released precise battery life figures for its new tablet.
    Toshiba plans to start selling its dynaPad online and at Microsoft Stores in late January. The most affordable version will cost $569.99.
    The Toshiba dynaPad looks like a relatively capable solution for various tasks usually performed on tablets. It has a fine 12-inch display and comes with a digitizer pen. By contrast, Microsoft’s Surface 3 sports a 10.8-inch screen and does not come with a stylus (it has to be bought separately). Moreover, Toshiba’s tablet is also thinner and lighter than Microsoft’s Surface 3. In fact, the thickness of the dynaPad is similar to that of Apple’s 12.9-inch iPad Pro, but the latter weighs considerably more (713 grams, 1.57 pounds).
    Even though Toshiba has been trying to refocus its PC business and concentrate on business and enterprise customers, it continues to release consumer devices that look very interesting, at least on paper. The dynaPad, with its rather low weight, relatively low price, advanced stylus and decent capabilities, looks like a viable rival not only to Microsoft’s Surface 3, but also to Apple’s iPad Air and iPad Pro.
    Gallery: Toshiba’s DynaPad Tablet to Hit Stores in Late January




