
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3841

    Anandtech: Running an NVIDIA GTX 780 Ti Over Thunderbolt 2

    A common issue for laptop users is the lack of GPU power. Even the fastest mobile GPUs, whether in SLI or Crossfire, cannot reach the performance of a higher-end desktop, mainly due to power consumption and heat generation limits. On top of that, laptops with high-end mobile GPUs tend to be far from portable because of the cooling they require. Sure, they are still easier to carry around than a full-size desktop system, but not many people are willing to lug one around on a daily basis. In other words, if you want a laptop that's genuinely portable, you are left with mediocre GPU performance that rarely satisfies an active gamer.
    Ever since the original Thunderbolt was released back in 2011, there has been a lot of discussion about the potential of using Thunderbolt for external GPUs. Today's mobile CPUs are more than capable of driving desktop GPUs, and since Thunderbolt is essentially just PCIe and DisplayPort in a single interface, a laptop with an external GPU makes almost too much sense.
    SilverStone's/ASUS' Thunderbolt eGPU enclosure at CES
    So far a handful of companies, such as MSI and SilverStone, have showcased external Thunderbolt GPU enclosures at trade shows, but due to issues such as performance and hot-plug support, none has made it to retail. Intel's decision to double the bandwidth with Thunderbolt 2 also made launching the original Thunderbolt-based designs less attractive, although with any luck TB2 should be an appropriate drop-in replacement. With GPUs especially, bandwidth can make a dramatic difference in performance, and given the niche nature of external Thunderbolt GPUs, many users wouldn't have been satisfied with a product that doesn't provide close to maximum performance.
    Another big issue is obviously driver and operating system support. To make matters worse, nearly all Thunderbolt-equipped devices are Macs, and traditionally Apple likes to keep very tight control over drivers and other elements of the OS, making it hard (or even impossible) to develop an external GPU that would also function under OS X. In the PC arena, a few motherboards and systems offer Thunderbolt support, and it is primarily up to Intel, working with Microsoft, to develop Windows drivers.
    DIY to the Rescue!
    Disclaimer: All information and results here are based on a forum post and a (now private) YouTube video. We cannot guarantee that the results are accurate, so any purchase decisions are made at your own risk, with the possibility that real-world results may not be on par with what is reported below.
    As no company has stepped up and commercialized a product yet, enthusiasts have been looking for a do-it-yourself way to drive an external GPU over Thunderbolt. I came across a very interesting setup at the Tech Inferno forums today and thought I would share it with a larger readership here. Forum member squinks has managed to run an NVIDIA GTX 780 Ti over Thunderbolt 2 using Sonnet's Echo Express III-D chassis with a Corsair RM450 power supply dedicated to the GPU.
    Courtesy of Tech Inferno forum user squinks
    The results are certainly promising. Based on squinks' own tests and GTX 780 Ti reviews posted online, the performance seems to be around 80-90% of full desktop performance in synthetic benchmarks (3DMark and Unigine Heaven). Given that Thunderbolt 2 offers only 20Gbit/s of bandwidth while a PCIe 3.0 x16 slot offers about 128Gbit/s, getting 80-90% of the performance is more than expected. This will vary by game: based on our own PCIe scaling tests, reduced PCIe bandwidth makes little to no difference in some titles, while in others the drop can be close to 50%. The more demanding the game is of the PCIe connection, the worse the performance. Either way, Thunderbolt 2 shows potential for external GPUs, even the most powerful ones that should also demand the most bandwidth.
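    As a quick back-of-the-envelope illustration of where those bandwidth figures come from (this sketch is ours, not from squinks' post), the Python snippet below derives the approximate link rates; the per-lane PCIe 3.0 figure assumes 8 GT/s with 128b/130b encoding and rounds away protocol overhead.

    # Rough bandwidth comparison between Thunderbolt 2 and common PCIe links.
    # Figures are approximate (protocol overhead ignored); purely illustrative.
    PCIE3_GBPS_PER_LANE = 8 * (128 / 130)   # 8 GT/s with 128b/130b encoding ~= 7.9 Gbit/s
    TB2_GBPS = 20                           # Thunderbolt 2 aggregates two 10 Gbit/s channels

    links = {
        "PCIe 3.0 x16":  PCIE3_GBPS_PER_LANE * 16,
        "PCIe 3.0 x4":   PCIE3_GBPS_PER_LANE * 4,
        "Thunderbolt 2": TB2_GBPS,
    }

    for name, gbps in links.items():
        share = gbps / links["PCIe 3.0 x16"]
        print(f"{name:14s} {gbps:6.1f} Gbit/s  ({share:.0%} of a full x16 slot)")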
    According to the forum posts, the setup is also pretty much plug and play, as long as the GPU is connected to an external monitor. Once everything has been connected and drivers installed, the GTX 780 Ti is recognized just as it would be in any desktop system. Having the external GPU drive the internal display is also possible, although there appear to be some limitations with this homebrew method. First, it only works if the computer doesn't have a discrete GPU, because in that case NVIDIA Optimus can be used to switch between GPUs. If there is already a discrete GPU in place (like the GT 750M in the high-end 2013 rMBP), then Optimus cannot be used and unfortunately you'll be limited to an external monitor. Second, there seems to be some loss in performance (another ~5-20% on top of the loss from Thunderbolt 2) when driving the internal display, which is likely due to Optimus and its limitations.
    The big question is whether such a setup is in any way affordable. Currently, the short answer is no. The Sonnet Echo Express III-D chassis alone costs $979, and you'll need to add the cost of the GPU and power supply to that. The chassis also weighs 7.5lb (3.4kg) before the GPU and power supply go in, so it's hardly all that portable. In total you are looking at roughly $1500 minimum if you go with a higher-end GPU (which you should, given the cost of the chassis). For comparison's sake, I quickly gathered parts for a decent gaming rig on Newegg and the total came to $764.94 (without GPU and PSU). That's with a Core i7-4770K, an ASUS Z87 motherboard, 8GB of DDR3-1600, a 120GB SSD, a 1TB hard drive and a mid-priced case, so we are not even dealing with a budget system. In other words, you can build a higher performance system for over $200 less and take full advantage of your GPU.
    All in all, it is fun to see that an external GPU connected via Thunderbolt 2 can actually be made to work. Apparently some folks at Tech Inferno have been doing this for a while using Thunderbolt to ExpressCard and ExpressCard to PCIe adapters, which in itself sounds like a high-latency arrangement. The price and the DIY nature are factors that won't exactly appeal to the masses, but there is a potential market for a retail product designed specifically for this. Pricing is, of course, a major factor: at $200-$300 I could see external GPUs gaining popularity, but once you go over $500 it becomes, in most cases, more sensible to build a dedicated gaming rig.


    More...

  2. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3842

    Anandtech: A Discussion on Material Choices in Mobile

    Within the past four years, the smartphone market has changed drastically. Displays have dramatically increased in quality, and battery life has improved. As OEMs converge on largely similar platforms, the material design of a phone has become increasingly important. Almost every OEM has made a major shift in the material design of its devices as the market becomes increasingly saturated and competitive. This is especially true at the high end, where the upgrade cycle has lengthened. As people find less and less reason to upgrade to the latest and greatest, OEMs have to change things to stave off slowing growth. Overall, this seems to mean going "back to basics" with new devices, which often entails improved material design.
    At first glance, this seems to make very little sense, especially when a great deal of people simply aren't concerned with materials. It's not unusual to hear the argument that because everyone uses a case, the design of the device shouldn't matter. It's often said that aluminum devices are less durable and heavier, and have worse radio reception, than ones made of polycarbonate. Other issues often cited include uncomfortable skin temperatures under load. Higher cost is also a problem, one that OEMs will often cite internally. With glass, it's almost universally understood that any drop risks shattering the brittle back. So the question remains: why do OEMs continue to push material design?
    Without a doubt, this is a complex topic. Material choices entail a huge number of trade-offs, and there isn't any one material with the best set of compromises. For the most part, smartphones are made from three key materials: plastic, glass, and metal.
    Plastic

    Within plastics, the most commonly used material is polycarbonate, which has high impact resistance, relatively good temperature resistance, and extreme flexibility. Great examples of polycarbonate are the battery doors of the Galaxy S and Note lines, and Nokia's Lumia devices. In general, it's almost impossible to point to a phone with a polycarbonate external build that has reception issues, as polycarbonate effectively doesn't attenuate radio signals, as shown on page 38 of this study of radio propagation differences. As the market is squeezed by decreasing profit margins, the low price of polycarbonate relative to glass or metal is also a significant advantage that can't be overlooked.
    While these advantages are reason enough to make a smartphone or tablet with a polycarbonate casing, there are disadvantages as well. Polycarbonate is a poor conductor of heat, which means that in today's thermally constrained devices, the sustained clock speeds of both the CPU and GPU on the SoC will be lower than in a phone or tablet made of a metal such as aluminum or magnesium. The same is generally true when comparing a polycarbonate device with one made of glass. For reference, the thermal conductivity of aluminum is 205 watts per meter kelvin (W/m*K), versus 156 W/m*K for magnesium, 0.8 W/m*K for ordinary glass, and 0.22 W/m*K for polycarbonate. The unit expresses how readily heat flows through a material: the power conducted per unit area for a given temperature gradient across it, so higher numbers mean heat is pulled away from the SoC faster. This means that, all else being equal, a plastic phone or tablet will generally run slower in intensive games than one made of metal or glass.
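    To make those W/m*K numbers a little more concrete, here is a minimal sketch of Fourier's law of conduction applied to a hypothetical 1mm-thick back cover; the dimensions and temperatures are made up for illustration, and in a real phone convection into the air (not conduction through the casing) is the actual bottleneck. The point is the roughly thousandfold spread between aluminum and polycarbonate.

    # Fourier's law: heat conducted Q = k * A * dT / d (watts), where k is thermal
    # conductivity (W/m*K), A the area, dT the temperature difference across the
    # casing and d its thickness. Dimensions and temperatures below are hypothetical.
    conductivity = {            # W/(m*K), values quoted in the text above
        "aluminum":      205.0,
        "magnesium":     156.0,
        "glass":           0.8,
        "polycarbonate":   0.22,
    }

    area      = 0.070 * 0.140   # back cover roughly 70 mm x 140 mm, in m^2
    thickness = 0.001           # 1 mm wall
    delta_t   = 10.0            # inside 10 K hotter than the outer surface

    for material, k in conductivity.items():
        watts = k * area * delta_t / thickness
        print(f"{material:13s} conducts ~{watts:8.1f} W through the back cover")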
    On top of this, while polycarbonate is extremely impact resistant, the flexibility of the material is a major issue for smartphones that have to be as thin and compact as possible. People often bring up the car analogy to argue that polycarbonate protects a phone or tablet better, but there's no such thing as a crumple zone in a phone. Even the back cover serves a purpose, as antennas are built into it in order to have the space for the huge number of frequencies supported. A back cover that bends into the phone is a real problem, as it will affect the delicate antenna connectors, which are often small, spring-like pieces that make contact with the back cover. While these metal connectors are elastic to a certain extent, once stretched too far they won't spring back. One of the most notable examples of this issue is the Tegra 3 variant of the HTC One X, which would often lose all WiFi and Bluetooth reception due to crushed antenna connectors; fixing the problem required additional reinforcement to keep the back cover from bending in too far. As seen in the photo from iFixit's teardown of the Galaxy S4 below, the connectors are small gold-plated nubs that touch parts of the back cover.
    Metal

    On the other end, metal is often hailed by reviewers as a superior material, though most reviewers focus on look and feel rather than the technical advantages and disadvantages of using metal. Of course, metal is an extremely broad term that covers around 80% of the known elements. For this discussion, the key metal used for the outer casing is aluminum; magnesium is another commonly used material, but it is mostly limited to the midframe.
    Of course, aluminum alloys have their advantages as well. With the stiff alloys used in smartphones and tablets, there is a significant structural advantage that helps protect internal components. To go back to the car analogy: because there aren't crumple zones in a compact phone or tablet, there is only the safety cell, which is made to be as rigid as possible to avoid crushing its contents. In short, aluminum is actually more durable, not less. As pointed out back in the iPhone 4 review, the external antennas required by all-aluminum designs can give better reception and performance than internal antennas. Even today, it's possible to see better reception from an all-metal device. For example, the Sprint One (M8) has a higher effective isotropically radiated power (EIRP) on 1900 MHz CDMA than the Sprint Galaxy S5. Higher EIRP generally translates into better radio performance, although it takes effective isotropic sensitivity (EIS) as well to see the full picture.
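    For those curious how EIRP is put together, it is simply the conducted transmit power plus the antenna gain, minus any losses, all in decibel terms. The short sketch below shows the arithmetic with made-up numbers; it is not based on the measured values of either phone mentioned above.

    # EIRP (dBm) = conducted transmit power (dBm) + antenna gain (dBi) - losses (dB).
    # The example inputs are hypothetical, chosen only to show the arithmetic.
    def eirp_dbm(tx_power_dbm, antenna_gain_dbi, losses_db=0.0):
        return tx_power_dbm + antenna_gain_dbi - losses_db

    # e.g. 24 dBm PA output, -1 dBi antenna, 0.5 dB of cable/matching loss
    print(f"EIRP = {eirp_dbm(24.0, -1.0, 0.5):.1f} dBm")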
    Metal is also harder to scratch than polycarbonate, although with an anodized finish, scratches are more likely to be visible if they expose the untreated metal underneath. Another key advantage is the much higher thermal conductivity of aluminum, which allows for better performance in situations where a device is thermally limited. After all, heat sinks are made of aluminum and/or copper, not polycarbonate or glass. The best example of this can be seen in the comparison between the Galaxy S5 and HTC One (M8) in the T-Rex rundown test, where the frame rate of the One (M8) is significantly higher than the Galaxy S5's, as seen in the graph below.
    Like every other material, aluminum is not perfect for a mobile device. Making the device from metal makes it impossible to use internal antennas unless plastic or glass "windows" are added to let signals in and out of the phone. This means the device will be less isotropic (direction-independent) in its reception of radio signals. Even with external antennas that turn parts of the metal casing into an antenna, detuning when a hand touches the antenna or bridges it to another conductive body is a major problem, as is the need to support multiple frequencies with an external body that can't readily change; the iPhone 5s or HTC One (M8) can't look radically different from operator to operator. While the use of multiple antennas (receive/transmit diversity) and active antenna tuners have made all-metal designs possible, there is still a noticeable difference in radio reception, and whether that difference is for better or worse depends on the frequency used.
    Outside of radio reception, aluminum alloys' lower limit of elastic deformation means that while the casing is better at protecting internal components, it's more likely to pick up cosmetic damage; polycarbonate is more likely to come out of a drop without dents or gouges. Aluminum bodies are also significantly more expensive, as the time and cost of working the material into a final product can push the price difference to as much as an order of magnitude, which takes away from other aspects of the device. Finally, while aluminum is far more effective at dissipating heat than polycarbonate, this also means a polycarbonate device will have lower perceived skin temperatures under load: it's more comfortable to hold a polycarbonate-bodied device even if the internals are at higher temperatures. By the same token, low ambient temperatures will make an aluminum-bodied device feel much colder than a polycarbonate-bodied one.
    Of course, magnesium changes things up as well. It's lighter than aluminum due to its lower density, can be RF transparent using specific coatings, and in general, carries many of the advantages that aluminum does over plastics such as polycarbonate and aluminosilicate glass, which include high thermal conductivity, relatively high rigidity, and relatively better scratch resistance. In theory then, magnesium would be better than aluminum.
    Unfortunately, from a mass production standpoint magnesium casings are generally infeasible. This is primarily due to the reactive nature of magnesium in oxygenated environments, and due to outgassing that causes bubbles and other surface defects in the material during surface treatments. Without surface treatment, magnesium rapidly corrodes as well. This means that it's not currently feasible to use magnesium as an external casing, although many manufacturers use it for the midframe.
    Glass

    Not to be forgotten, glass is another possibility for the external casing of a tablet or smartphone. It is the most rigid of the three materials and resists scratches the best, but it is also the most brittle and the most susceptible to shattering, because glass can only deform elastically. Aluminosilicate glass, more commonly referred to as Gorilla Glass (when made by Corning), is the most common type used for the external casing of a phone. Its thermal conductivity sits between aluminum and polycarbonate, though only slightly above polycarbonate and far below aluminum. It also doesn't significantly attenuate radio signals, which means internal antennas can be used. The disadvantage is that glass is incredibly fragile and can pose a real safety hazard, and it significantly constrains the shape of the phone; this is why glass-bodied devices have generally been small, with the glass portion a flat sheet.
    As mentioned previously, there are plenty of complications once you factor in the actual layout of the device. Thermal dissipation from a polycarbonate-bodied device can be improved with a magnesium midframe that spreads heat into the display and other components, increasing the rate of heat transfer from the internals to the air and the hand. Wall thicknesses and different types of plastic, metal, and glass can significantly reduce the severity of each material's disadvantages. For example, blending ABS with polycarbonate can significantly increase the rigidity of the material, applying anti-shatter film to glass can catch shards and reduce the hazard if the glass does shatter, and new antenna tuning technologies enable all-metal designs.
    Conclusion

    Of course, the question still remains: why does all of this matter? After all, Apple didn't have to worry about thermal dissipation with the iPhone 4 because the SoC didn't generate enough heat, yet it used a steel side ring and a glass back cover. While the glass back and stainless steel ring were more effective at protecting the internal components, minor improvements to drop protection and possible improvements to reception wouldn't be strong justification for pursuing such a design. So why would Apple do this?
    The answer lies in industrial design. While it's all too easy to conflate this with pure aesthetics, industrial design is a crucial aspect of any device. After all, smartphones and tablets are handled constantly, and while we look primarily at the display, the shape, look, and feel all dramatically affect the experience. If it fits the hand better, feels better, and looks better, it is better. Unnecessary elements hurt the focus of a design; honest design helps it. Good design is obvious and invisible, and it's often only when we use something poorly designed that we notice what is well designed. Advances in technology can and do fix the issues that materials have, but nothing can fix bad design. While much of this is subjective, as the mobile industry reaches saturation, industrial design and material design will become crucial differentiators. If anything, they already have.


    More...

  3. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3843

    Anandtech: Corsair’s AX1500i Released: A 1500W 80 Plus Titanium PSU

    Shopping around for a power supply on a tight budget can be a bit of an ordeal. On forums, everyone has their own opinion of what constitutes a good power supply, and, similarly to mechanical HDDs, a single bad experience can put a user off a brand forever. My golden rule, unless you need a specific feature or amperage on the power rails for unusual GPUs, is to take the total power draw of your system and add 40%. My analogy is this: a car whose top speed is 80mph will squeak and rattle if you run it every day at 70mph, whereas a car whose top speed is 130mph will hum along nicely at 70mph. Others may disagree, but I find this a nice guideline when building systems for family and friends.
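    If you want that rule of thumb in code form, here is a trivial sketch; the component wattages are hypothetical example figures, not measurements of any particular build.

    # The "total system draw plus 40%" guideline described above.
    def recommended_psu_watts(component_draw_watts, headroom=0.40):
        """Sum the estimated load and add the suggested headroom."""
        return sum(component_draw_watts) * (1.0 + headroom)

    # e.g. CPU ~140 W, GPU ~250 W, drives/fans/motherboard ~60 W
    build = [140.0, 250.0, 60.0]
    print(f"Recommended capacity: ~{recommended_psu_watts(build):.0f} W")   # ~630 W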
    Most desktop systems bought and sold today are fairly basic, with integrated graphics or a low-end graphics card, making power requirements very low. However, the other extreme is also true, with users wanting to make the most out of three or four high-end GPUs and a heavy dose of overclocking. If you recall our Gaming CPU article from April 2013, we used a 24-thread dual-processor system with four 7970 GPUs, lightly overclocked, which drew 1550W at load. This is why power supplies north of 1000W exist, and it can be very difficult to make such units efficient. To that end, Corsair is today releasing the AX1500i, a 1500W model carrying 80 Plus Titanium certification.
    80 Plus Titanium is a newer addition to the 80 Plus certification levels, derived from server requirements and first introduced back in 2012. As with all 80 Plus levels, it requires a specific minimum efficiency at 20%, 50% and 100% load (efficiency can vary freely between these points), and Titanium adds a requirement at 10% load as well. For the AX1500i, this means a minimum efficiency of 90/92/94/90% at 10/20/50/100% load on 110-120V supplies and 90/94/96/91% on 220V+ supplies.
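    As a small illustration of how those certification points work, the sketch below checks a made-up efficiency curve against the 110-120V Titanium minimums quoted above; the "measured" values are invented for the example.

    # 80 Plus Titanium minimum efficiency at 10/20/50/100% load (110-120V figures above).
    TITANIUM_115V = {0.10: 0.90, 0.20: 0.92, 0.50: 0.94, 1.00: 0.90}

    def meets_titanium(measured):
        """True if the measured efficiency meets or beats the minimum at every load point."""
        return all(measured.get(load, 0.0) >= minimum
                   for load, minimum in TITANIUM_115V.items())

    example = {0.10: 0.905, 0.20: 0.932, 0.50: 0.945, 1.00: 0.912}
    print("Meets 80 Plus Titanium (115V points):", meets_titanium(example))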
    The Corsair design implements their Zero-RPM Fan technology, meaning the power supply fan will only activate when a 450W load or above is detected.
    The supply comes with ten connectors for PCIe devices, is fully modular, and has native USB support for Corsair Link, which allows monitoring of the power supply, including real-time temperature, power use and efficiency readings, from within the operating system. The AX1500i blows the ATX specification out of the water in terms of size, measuring 225mm (8.86in) long, which is still shorter than a big GPU.
    The price is not for the faint-hearted: $450 MSRP, initially available direct from Corsair and then from worldwide distributors in late May. The price reflects the high power rating combined with the high efficiency certification, as well as a 7-year warranty. I have already seen interest online from extreme overclockers and modders designing hardcore top-end desktop machines, which indicates the niche that Corsair believes this supply will fit into.
    Gallery: Corsair’s AX1500i Released: A 1500W 80 Plus Titanium PSU



    More...

  4. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3844

    Anandtech: AMD Announces Project Skybridge: Pin-Compatible ARM and x86 SoCs in 2015,

    This morning AMD decided to provide an update on its CPU core/SoC roadmap, particularly as it pertains to the ARM side of the business. AMD has already committed to releasing a 28nm 8-core Cortex A57 based Opteron SoC this year. That particular SoC is aimed exclusively at the enterprise and doesn't ship with an on-die GPU.
    Next year, AMD will release a low-power 20nm Cortex A57 based SoC with integrated Graphics Core Next GPU. The big news? The 20nm ARM based SoC will be pin compatible with AMD's next-generation low power x86 SoC (using Puma+ cores). The ARM SoC will also be AMD's first official Android platform.
    I don't expect we'll see standard socketed desktop boards that are compatible with both ARM and x86 SoCs, but a pin compatible design will have some benefits for embedded, BGA solutions.
    AMD's motivation behind offering both ARM and x86 designs is pretty simple. The TAM (Total Addressable Market) for x86 is decreasing, while it's increasing for ARM. AMD is no longer married to x86 exclusively and by offering OEMs pin compatible x86/ARM solutions it gets to play in both markets, as well as benefit if one increases at the expense of the other.
    Note that we're still talking about mobile phone/tablet class CPU cores here (Cortex A57/Puma+). AMD has yet to talk about what it wants to do at the high end, but I suspect there's a strategy there as well.


    More...

  5. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3845

    Anandtech: AMD is also working on a new 64-bit x86 Core

    Jim Keller joined Mark Papermaster on stage at AMD's Core Innovation Update press conference and added a few more details to AMD's K12 announcement. Keller stressed AMD's expertise in building high frequency cores, as well as marrying the strengths of AMD's big cores with those of its low power cores. The resulting K12 core is a 64-bit ARM design, but Jim Keller also revealed that his team is working on a corresponding 64-bit x86 core.
    The x86 counterpart doesn't have a publicly known name at this point, but it is a new design built from the ground up.


    More...

  6. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3846

    Anandtech: AMD Announces K12 Core: Custom 64-bit ARM Design in 2016

    In 2015 AMD will launch project SkyBridge, a pair of pin-compatible ARM and x86 based SoCs. Leveraging next-generation Puma+ x86 cores or ARM's Cortex A57 cores, these SoCs form the foundation of the next phase in AMD's evolution, where ARM and x86 are treated as equal-class citizens. As I mentioned in today's earlier post, however, both of these designs aim at the lower end of the performance segment. To address a higher performance market, AMD is doing what many ARM partners have done and is leveraging an ARM architecture license to design its own microarchitecture.

    In 2016 AMD will release its first custom 64-bit ARMv8 CPU core, codenamed K12. Jim Keller is leading the team that is designing the K12, as well as a corresponding new 64-bit x86 design. AMD is pretty quiet about K12 details at this point given how far away it is. Given the timing I'm assuming we're talking about a 14/16nm FinFET SoC. On the slide above we see that AMD is not only targeting servers and embedded markets, but also ultra low power client devices for its 64-bit ARM designs (presumably notebooks, chromebooks, tablets). AMD has shied away from playing in the phone market directly, but it could conceivably play in that space with its semi-custom business (offering just a CPU/GPU core with other IP). Update: AMD added that server, embedded and semi-custom markets are obvious targets for K12.
    There's also a discussion of modularity, treating both ARM and x86 cores as IP modules rather than discrete designs. AMD continues to have a lot of expertise in SoC design; what it really needs is a focus on improving single threaded performance. I can only hope (assume?) that K12 won't be Bulldozer-like and will instead prioritize single threaded performance. It's also worth pointing out that there hasn't been a single reference to the Bulldozer family of CPU cores in any of these announcements...
    Update: Jim Keller added some details on K12. He referenced AMD's knowledge of doing high frequency designs as well as "extending the range" that ARM is in. Keller also mentioned he told his team to take the best of the big and little cores that AMD presently makes in putting together this design.



    More...

  7. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3847

    Anandtech: ARM Shares Updated Cortex A53/A57 Performance Expectations

    With the first Cortex A53 based SoCs due to ship in the coming months, and Cortex A57 based designs to follow early next year, ARM gave us a quick update on performance expectations for both cores. Given the timing of both designs we'll see a combination of solutions built on presently available manufacturing processes (e.g. 28nm) as well as next gen offerings (20/16FF). The graph above gives us an updated look at performance expectations (in web browsing workloads) for both ARM 64-bit cores.
    If we compare across the same process node (28nm in both cases), the Cortex A53 should deliver nearly a 50% increase in performance over ARM's Cortex A7, and the Cortex A57 should offer roughly the same increase over the Cortex A15. Although the A57 will do so at higher power, power efficiency may still be better depending on the workload thanks to the added performance. Thankfully we won't see many A57 designs built on 28nm in mobile (AMD's first Cortex A57 design is aimed at servers and is built on a 28nm process).
    If you combine architectural improvements with a new process node, the gains are substantial. Move to 20nm or 16FF for these designs and the improvement over their 32-bit predecessors easily exceeds 50%.
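    To show how those gains compound, here is a tiny sketch: the roughly 1.5x architectural figure comes from the comparison above, while the process-node factor is a purely hypothetical placeholder rather than an ARM-quoted number.

    # Architectural and process-node gains compound multiplicatively.
    arch_gain    = 1.5    # e.g. Cortex A53 vs Cortex A7 at the same 28nm node (from above)
    process_gain = 1.2    # hypothetical uplift from moving to 20nm/16FF (assumed, not quoted)

    total_gain = arch_gain * process_gain
    print(f"Combined uplift: ~{(total_gain - 1) * 100:.0f}% over the 32-bit predecessor")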
    ARM also provided some Geekbench 3 performance data comparing the Cortex A57 to A15, both in 32-bit and 64-bit mode. We already know Geekbench 3 is particularly sensitive to the new instructions that come along with AArch64, but even in 32-bit mode there's still a 15 - 30% increase in performance over the Cortex A15 at the same clocks.
    Qualcomm has already announced its Snapdragon 410, 610 and 615 will use ARM's Cortex A53, while its 808 and 810 will use a combination of Cortex A53s and A57s.


    More...

  8. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3848

    Anandtech: ARM's Impact on the Chip Market: 100M China-Designed SoCs Shipped in 2013

    At its second ever Tech Day, ARM shared a pretty interesting slide about its impact on the mobile SoC market. ARM's business model allows for pretty much anyone to be a player in the SoC space. This is in stark contrast to the PC business that's dominated by a single silicon player, with perhaps one lower volume second source. ARM's IP licensing business has paved the way for a number of low cost SoC vendors, particularly those based in China, to enjoy substantial marketshare. While we've only reviewed a single MediaTek based device on AnandTech, the numbers out there are increasing tremendously.
    Tablets in particular are the perfect target for low cost SoCs given that you can successfully ship a WiFi-only design. ARM's chart above shows just how successful its China-based SoC vendors have been in the tablet space, shipping over 100M SoCs in 2013 (~40% of ARM's tablet business).



    More...

  9. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3849

    Anandtech: Lenovo’s First Consumer Chromebook: N20 and N20p for $279 to $329

    Chromebooks still sit at numbers two and three in Amazon's best-selling laptop range, as well as spots 1-6 in the highest-rated section, so it makes sense for a company as large as Lenovo to jump into this market. Lenovo has released Chromebooks in the past, namely the ThinkPad 11e and Yoga 11e, both for education, but the new N20 and N20p will be the first available to the general public.
    I am awaiting a response from Lenovo on exact specifications and more images; however, both models will feature a Celeron processor and a 1366x768 11.6-inch screen with 10-point multi-touch. Both models will come with 16GB of internal storage plus 100GB of cloud storage, as well as what Lenovo calls 'a full-sized keyboard and oversized trackpad' for ease of use. Lenovo mentions WiFi and Bluetooth connectivity, but not the standard or the implementation. Maximum thickness is 17.9mm and the device weighs in at 1.4kg (3.1lbs).
    The N20 is a regular laptop design, whereas the N20p features Lenovo's multi-mode technology, allowing the screen to be rotated 300 degrees into a 'stand' mode for watching videos and sharing the screen with others. Both Chromebooks will allow opening, editing and sharing Microsoft Word and Excel files, and the batteries are rated at 8 hours.
    Prices will start at $279 for the N20 and $329 for the N20p, which will be available from the beginning of July and August respectively, and I expect we might see one at Computex. Other sources that have been hands-on with the device pre-launch report that the $329 model has 4GB of DRAM and 802.11ac WiFi.
    To put the price and specifications in perspective, a similarly priced model on Amazon at $285 comes with the Haswell-based Celeron 2955U at 1.4 GHz, 2GB of DRAM and 32GB of internal storage.
    Sources: Lenovo, SlashGear


    More...

  10. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,809
    Post Thanks / Like
    #3850

    Anandtech: D-Link Enters Home Automation Market with Wi-Fi Smart Plug

    In our CES 2014 coverage, we pointed out leaks about D-Link's entry into the home automation market with a 'smart' plug controlled via Wi-Fi. Today, D-Link is making the information official.
    The D-Link Wi-Fi Smart Plug (DSP-W215) will be the first entry in a series of home automation solutions from the company. Amongst the touted features are power scheduling, local and remote control, monitoring of energy usage, and overheating protection (via a thermal sensor) with automatic shutoff. Control and monitoring are handled through the free mydlink Smart Plug mobile app.
    Gallery: D-Link DSP-W215 Wi-Fi Smart Plug


    Thanks to the FCC filings, we already have data on the internal platform. Unlike Belkin's WeMo Insight, which appears to use a Ralink chipset, the DSP-W215 goes the Qualcomm Atheros route with the Hornet WiSoC platform in the AR1311-AL1A, a member of the same family that is used in Ubiquiti's mFi mPower units.
    Gallery: D-Link DSP-W215 Wi-Fi Smart Plug


    The WeMo Insight and the D-Link Wi-Fi Smart Plug seem to share almost the same feature set, so it will be interesting to see what the differences are in practice, both for general consumers and for developers and power users.
    In particular, it remains to be seen if the APIs to monitor and control the unit are open and whether the unit can be controlled via a web browser and/or PC application. The unit is available today with an EDLP (every-day low price) of $50, which undercuts the MSRP of Belkin's WeMo Insight Switch by $10.


    More...
