
Thread: Anandtech News

  1. RSS Bot FEED (post #3291)

    Anandtech: AMD Comments On Upcoming Crossfire Eyefinity Frame Pacing Driver

    For AMD owners currently wondering when AMD’s Crossfire frame pacing improvements will make their way to multi-monitor Eyefinity configurations, we finally have some additional news on the matter. While attending AMD’s 2014 GPU product showcase, we were able to pull aside AMD’s PR reps and get a comment on the matter.
    AMD is currently targeting a mid-fall release for the phase 2 fixes, phase 2 being the frame pacing improvements for Crossfire Eyefinity. Based on what we’re hearing, that would be a November release; however, AMD has also made it clear that they’re trying to push these fixes through as fast as they reasonably can. So if AMD can speed it up at all we’d certainly expect them to, as no one over at AMD seems to be happy that phase 2 is taking so long.
    Like phase 1, phase 2 will cover all currently supported AMD families: the 5000, 6000, and 7000 series. It’s a generic solution that will be applicable to all of them.
    We’ll have more on the matter once AMD is ready to talk about it in full detail, hopefully in the not too distant future.

    More...

  2. RSS Bot FEED (post #3292)

    Anandtech: AMD Catalyst 13.10 Beta2 Drivers Now Available

    With the highly anticipated Battlefield 4 beta test kicking off today, AMD has pushed out their own driver update to coincide with the release of DICE’s multiplayer shooter.
    Catalyst 13.10 Beta2 is the latest iteration of AMD’s current driver branch (13.200), which started back with the Catalyst 13.8 betas. New to this version, along with profiles and fixes for Battlefield 4, are profile updates for Total War: Rome II, Saints Row 4, and a few other games. AMD’s release notes also make mention of “AMD CrossFire frame pacing improvements for CPU-bound applications,” though they do not specify which applications those are.
    On the bug fix side of matters, 13.10 Beta2 fixes outstanding issues with Autodesk Inventor 2014, and some black screen sleep issues with AMD’s Enduro technology.
    As always, you can pick up the drivers over at AMD’s support website.

    More...

  3. RSS Bot FEED (post #3293)

    Anandtech: Samsung Galaxy Note 3 Review

    I still remember the first time I held the original Galaxy Note. At that point in time it wasn’t really obvious just how critical larger-display smartphones were going to be in the future, nor just how close the smartphone market was to becoming a mature one. In a mature market it’s all about filling in the niches, something Samsung has been doing since the very beginning by casting a very large form factor net with its lineup of Android devices.
    I remember being intrigued with the original Note more for the active digitizer feature (S-Pen) than the large display. It was during the height of the Draw Something craze, and having a stylus seemed like a logical advantage. Two years later I lean the other way entirely; it’s the bigger display that makes me interested in the form factor, not just as a curiosity but as something I actually want to use daily.
    This is now Samsung’s third Galaxy Note, and as the adage goes, hopefully the third time is indeed a charm. Not that the first two weren’t wildly popular to begin with.

    More...

  4. RSS Bot FEED (post #3294)

    Anandtech: Samsung Galaxy Note 10.1 (2014 Edition) Review

    Ever since the arrival of the Nexus 10, it’s been hard recommending other, lower resolution 10-inch Android tablets. Although not the knockout success that the Nexus 7 became, the Nexus 10 did offer a good alternative to the iPad at a lower price. Given that Samsung made the aforementioned 10-inch Nexus, complete with its 2560 x 1600 display, we wondered when a similar panel might grace Samsung’s own tablet lineup. A few weeks ago we got the answer we’ve been waiting almost a year for.
    The latest iteration of Samsung’s Galaxy Note 10.1, aptly named the 2014 Edition, ships with the firm’s own 10.1-inch 2560 x 1600 display. It’s not the display alone that Samsung hopes to sell its latest Note 10.1 on; the rest of the package is similarly specced to the max.
    Unlike the Galaxy Note 3 where the majority of devices sold will likely use Qualcomm’s Snapdragon 800, the new Note 10.1 uses Samsung’s own Exynos 5420 SoC for all WiFi models. It’s only the LTE versions that will leverage Qualcomm silicon, but WiFi tablets still sell extremely well. All of this makes the 2014 Edition the first Samsung device to ship with its own Cortex A15 silicon in the US since the Nexus 10.
    Add 3GB of RAM, tick the 802.11ac box and all you’re missing is USB 3.0 from the Galaxy Note 3. The result is Samsung’s first truly high end 10.1-inch Android tablet since the Nexus 10, and as its name implies, it comes with an S Pen.

    More...

  5. RSS Bot FEED (post #3295)

    Anandtech: Gigabyte P34G First Impressions: A Thin and Light Gaming Notebook

    Over the course of a year we look at a lot of devices and hardware; while it may require a lot of effort on our part to publish a full review, the good news is that there are many products where you don’t need to invest much time before you can decide whether or not you’re interested in reading more.
    Take laptops as a prime example: I can walk through Best Buy, Walmart, Costco, etc. and look at all of the laptops, and in most cases I will know within minutes whether or not a laptop is worth a closer look. That’s because despite all the areas where we benchmark systems (testing the CPU, GPU, battery life, and LCD quality), the real factors that make one laptop good and another mediocre are easily discerned without a lot of testing: display quality, the feel of the keyboard, build quality, the touchpad, the speakers, and the overall design of a laptop. Some people can be content with just about anything, but as a technology enthusiast I look for products that don’t just work but work well.
    You might think based on the above that this first impressions piece isn’t going to end well for Gigabyte, but that’s actually not the case! We’ve already reviewed the primary competition (Razer Blade 14 and MSI GE40 -- though we’re still looking for a Clevo W230ST), which are all relatively thin and light laptops with a decent amount of performance available. Gigabyte is the latecomer to the party, but the P34G has been on our radars since we first heard about it, and despite the delays we’re happy to report that we now have one in hand and are working on the review. The old cliche is “good things come to those who wait”, and that may very well be the case with the P34G.
    It’s difficult to fully experience a laptop in just a couple of days, but while I can’t necessarily say if a laptop gets everything right, I can usually suss out the problem areas quite quickly. The Razer Blade 14 for example has a low quality LCD that sticks out like a sore thumb, and the MSI GE40 display isn’t much better (though it does have the benefit of costing quite a bit less than the Blade). The Gigabyte P34G basically splits the difference with an online price of $1400, and what’s more you don’t end up with some very questionable compromises.
    As a side note, this is actually the first Gigabyte laptop we’ve had for review in quite some time. I think the last Gigabyte offering we looked at was a questionable at best netbook, and many of those budget devices ended up being rather difficult to recommend and even more tedious to review. That’s not to say that Gigabyte hasn’t been making laptops in the interim, but they haven’t been pushing for product reviews, not from us at least. After a long hiatus, and given the difficulty of getting most areas of a laptop “right” -- just look at how many established companies still seem to flub the core functionality of a laptop! -- I have to say that the P34G is one of the more impressive laptops I’ve seen in some time. Let me start with our usual look at the specifications, but the components really only tell a small part of the story.
    Gigabyte P34G Specifications
    Processor: Intel Core i7-4700HQ (quad-core 2.4-3.4GHz, 6MB L3, 22nm, 47W)
    Chipset: Intel HM87
    Memory: 1x8GB DDR3L-1600 (11-11-11-28); second SO-DIMM slot available (2x8GB in test system)
    Graphics: GeForce GTX 760M 2GB GDDR5 (768 cores, 627MHz + Boost 2.0, 4GHz); Intel HD Graphics 4600 (20 EUs at 400-1200MHz)
    Display: 14.0" anti-glare 16:9 1080p AHVA (1920x1080 AUO B140HAN01.1)
    Storage: 128GB mSATA SSD (LiteOnIT LMT-128M6M); 1TB 5400RPM HDD (Toshiba MQ01ABD100)
    Optical Drive: N/A
    Networking: 802.11n WiFi (Intel Wireless-N 7260, 2.4GHz 2x2:2, 300Mbps capable); Bluetooth 4.0 (Intel); Gigabit Ethernet (Realtek RTL8111/8168/8311)
    Audio: Realtek HD; stereo speakers; headset (headphone/mic combo) jack
    Battery/Power: 6-cell, 50Wh; 120W max AC adapter
    Front Side: N/A
    Left Side: headset jack, 2 x USB 3.0 (1 x charging), 1 x VGA, Gigabit Ethernet, Kensington lock
    Right Side: flash reader (SD), 2 x USB 2.0, 1 x HDMI, AC power connection
    Back Side: 2 x exhaust vent
    Operating System: Windows 8 64-bit
    Dimensions: 13.39" x 9.41" x 0.83" / 340mm x 239mm x 21mm (WxDxH)
    Weight: 3.87 lbs (1.76kg)
    Extras: 720p HD webcam, 80-key backlit keyboard
    Pricing: $1399 online
    Most of the specifications are what you’d expect from a modern upper-midrange laptop, with Intel’s entry-level quad-core i7-4700HQ Haswell CPU paired with NVIDIA’s GTX 760M GPU. This is almost identical to the configuration in the MSI GE40, except with a 47W CPU that’s clocked a bit higher than the 37W i7-4702MQ. Nearly everything else is equivalent to what you’ll see elsewhere, with a standard selection of IO ports and connections.
    The real differentiator between the P34G and the GE40 and Razer Blade 14 is the LCD. Rather than a poor quality TN panel, Gigabyte equips their laptop with an AHVA (similar to IPS) 1080p panel, and it makes all the difference! I haven’t tested colors yet, but it looks good and contrast is clearly higher than 500:1 (and probably closer to 1000:1). Running at native resolution may prove a bit too much for the GPU unless you’re willing to turn down a few details (medium to high settings should prove workable), but when not gaming I find having the extra resolution is a great benefit.
    Of course, having a nice display and decent components only gets you half-way there. Clevo has frequently used all the right components but then failed to put everything into a compelling chassis, often skimping on the touchpad and other elements. Thankfully, I don’t see any major issues with these areas on Gigabyte’s laptop. The keyboard action is decent if not great, with only a slight amount of flex as you type. The backlighting is good, and the layout is mostly good -- I’m not super keen on the layout of the cursor keys, or the lack of dedicated document navigation keys, but I’ve experienced a lot worse. I still prefer the layout on the MSI GE40, but I think the backlighting on the P34G looks nicer.
    The touchpad is again in the good but not necessarily great category. Gigabyte chose to go with Elan hardware, and all the usual gestures are present (including the Windows 8 edge-swipe gestures). In practice the touchpad works well, but I have had quite a few inadvertent touchpad activations while typing this. I just cranked the palm rejection to maximum, and that seems to be helping, but it’s still not ideal. Also interesting is that the touchpad isn’t a clickpad. I wasn’t entirely sold on clickpads when they first started showing up, but after almost two years of seeing clickpads the change back to a pure touchpad is noticeable. It’s not untenable or bad, but all told I prefer a good clickpad over a similar quality touchpad.
    Build quality is one of the most surprising areas for me, as again it seems so many companies utterly fail in that regard. I’m not sure if the palm rests are some magnesium alloy, aluminum, or just plastic that feels better than the cheap glossy stuff, but I think it’s some form of metal. The top cover definitely feels like it’s made of aluminum and doesn’t show much flex at all; however, it has a soft-touch coating that does show fingerprints pretty easily, and even wiping it down with the included microfiber cloth didn’t help much. The bottom of the chassis is plastic, and we’re definitely not talking about MacBook Pro build quality here (or Razer Blade quality for that matter), but it’s better than MSI’s GE40, and it’s thinner and lighter as well.
    In terms of the core functionality, then, Gigabyte did very well with the P34G. It feels reasonably sturdy, it looks nice, and I have no serious issues with the keyboard or touchpad. The display is also great and definitely makes this a laptop I can recommend, provided that it actually runs acceptably without getting too hot, and as long as battery life is acceptable.
    Full results in those areas will have to wait for the full review, and honestly I’m a little concerned with my rough estimate of battery life. I started a timer when I began typing, and 95 minutes later I’ve gone from 95% battery charge to 44% on the Balanced power profile. I haven’t been doing anything particularly taxing, though the LCD is at max brightness right now; even so, that works out to just 3-4 hours of battery life while typing in Google Drive, and that’s not very good. I did just notice that the NVIDIA GPU was set to “High Performance” instead of “Automatic”, however, so hopefully that means the GPU was active and consuming extra power. I’ll find out over the course of the week.
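    For reference, here’s the back-of-the-envelope math behind that estimate; a minimal sketch assuming the drain rate stays roughly linear:

```python
# Rough battery life estimate from the observed drain (assumes a roughly linear rate).
elapsed_minutes = 95          # time spent typing so far
start_pct, end_pct = 95, 44   # observed battery percentage at start and now

drain_pct = start_pct - end_pct               # 51 percentage points used
rate_pct_per_min = drain_pct / elapsed_minutes
estimated_hours = 100 / rate_pct_per_min / 60 # time to drain a full charge

print(f"{estimated_hours:.1f} hours")         # ~3.1 hours, i.e. the 3-4 hour ballpark
```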
    Other than battery life potentially being weaker than some of the competition, so far I’ve found a lot to like with the P34G. Pricing is at least reasonable for what you get: $1400 with the 128GB SSD and a 1TB HDD is only $100 more than MSI’s GE40. Performance should be pretty much the same (slightly faster CPU for the P34G), and getting a high quality display and subjectively better build quality is definitely worth the price increase in my book.
    We’ll be back with the full review likely next week, so stay tuned. Until then, here are some pictures of the P34G. If you have any specific questions (outside of the performance metrics, which I'll be working on testing), let me know and I'll try to answer them. Likewise, if there are any aspects of the P34G you want me to test outside of our normal benchmark suite, post in the comments and I'll see what I can do.
    Gallery: Gigabyte P34G First Impressions: A Thin and Light Gaming Notebook

    More...

  6. RSS Bot FEED (post #3296)

    Anandtech: Nixeus NX-VUE27D : A 27" $450 WQHD (2560x1440) IPS LED DP-Only Monitor

    In August 2012, Nixeus launched the VUE27, a 27" WQHD (2560x1440) S-IPS LED monitor with a $430 price tag. However, the high demand led to a backlog and the monitor currently retails close to $500. The follow-up was a 30" WQXGA version priced at $700, the Nixeus VUE 30. As expected, the price has now increased to $890. By providing US-based service / warranty, they managed to win over quite a big segment of the market which was being served by eBay sellers based in Korea. However, with Monoprice getting into the game, the competition in this market has become hot. In order to counter the pricing pressure, Nixeus is introducing a new model, the NX-VUE27D. While the earlier models had a wide variety of input ports, Nixeus is making this one DisplayPort only. Fortunately, for the $450 pricing, a Mini-DisplayPort to DisplayPort cable as well as a DisplayPort cable are bundled.

    The claimed features and specifications of the NX-VUE27D are as follows:

    • 27" IPS LED Backlight Display Monitor
    • 2560 x 1440 WQHD
    • Compatible with Thunderbolt and DisplayPort output devices
    • 16.7 million True Colors
    • 100% sRGB Color Gamut
    • VESA Mounts 3.937" x 3.937" (100mm x 100mm)
    • Height Adjustable Base Stand with Tilt, Swivel and 90° Pivot
    • Edge to Edge Plasma Infused Glass to reduce reflection
    • Thin Bezel Design
    • 2 Year Limited Warranty
    • Brightness: 380 cd/m2
    • Native Contrast Ratio: 1000:1
    • Viewing Angles: 178° horizontal / 178° vertical
    • Refresh Rate: 60Hz
    • Response Time: 6ms (Gray to Gray)
    • Pixel Pitch: 0.233mm
    • Input Port: DisplayPort
    • Power Consumption: 72 watts
    • Accessories included: Mini-DisplayPort to DisplayPort cable (for compatibility with Mac, Macbooks, and Thunderbolt devices), DisplayPort cable, Quick Set-up guide and external power supply (North America)

    The Nixeus NX-VUE27D is slated to ship on October 22, 2013, with pre-orders currently on at Amazon and Comp-U-Plus.
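    As a quick sanity check on the quoted pixel pitch, here is a minimal sketch using the 27-inch diagonal and 2560 x 1440 resolution from the spec list above:

```python
import math

# Verify the quoted 0.233mm pixel pitch from the panel resolution and diagonal.
h_pixels, v_pixels = 2560, 1440
diagonal_inches = 27.0

diag_pixels = math.hypot(h_pixels, v_pixels)   # ~2937 px along the diagonal
ppi = diag_pixels / diagonal_inches            # ~108.8 pixels per inch
pitch_mm = 25.4 / ppi                          # ~0.233 mm, matching the spec

print(f"{ppi:.1f} PPI, {pitch_mm:.3f} mm pixel pitch")
```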

    More...

  7. RSS Bot FEED (post #3297)

    Anandtech: Western Digital Launches My Cloud Consumer NAS Platform

    The network attached storage market is growing by leaps and bounds. While the SMB (small and medium business) / enterprise market is driven by speed, IOPS and concurrent access support, the consumer segment is primarily driven by capacity and ease of use. Vendors have typically targeted the consumer / SOHO NAS platform with RISC-based chipsets, while Atom-based units target the higher-end SOHO and SMB market. Western Digital's SMB NAS units run Windows Storage Server (the Sentinel series), but they also have a Debian Linux platform for consumer units. We have evaluated the Linux-based My Book Live before. Running on the Applied Micro APM82181 PowerPC-based platform, the unit earned our recommendation for the extreme ease-of-use and mobile app ecosystem.
    Today, Western Digital is announcing a very ambitious update to the My Book Live. At launch, the new My Cloud lineup will have only one member. This member, a network attached hard-disk, will come in 2, 3 and 4 TB capacities priced at $150, $180 and $250 respectively. However, under the same lineup, Western Digital also plans to bring out two and four-drive configurations to the market soon.
    The My Cloud units sport a single Gigabit Ethernet connection and a dual-core processor (WD refused to disclose its identity, but it should become apparent when we receive units in hand). The units are backed by free iOS and Android apps (with direct upload from the mobile device to the NAS as the main feature) as well as the WD SmartWare Pro software for PC backups. Time Machine support is also available for Mac users. The units are also compatible with DLNA devices as a DMS (Digital Media Server).
    The launch of the My Cloud lineup will definitely heat up the competition in the consumer NAS segment. WD's storage background will also help the units hit an optimal price point. Interesting aspects to look forward to would be whether the two and four-drive units will have removable drives and/or hot-swap capabilities. We are looking forward to reviewing one of these when the multiple-drive versions hit the market.

    More...

  8. RSS Bot FEED (post #3298)

    Anandtech: They're (Almost) All Dirty: The State of Cheating in Android Benchmarks

    Thanks to AndreiF7's excellent work on discovering it, we kicked off our investigations into Samsung’s CPU/GPU optimizations around the international Galaxy S 4 in July and came away with a couple of conclusions:
    1) On the Exynos 5410, Samsung was detecting the presence of certain benchmarks and raising thermal limits (and thus max GPU frequency) in order to gain an edge on those benchmarks, and
    2) On both Snapdragon 600 and Exynos 5410 SGS4 platforms, Samsung was detecting the presence of certain benchmarks and automatically driving CPU voltage/frequency to their highest state right away. Also on Snapdragon platforms, all cores are plugged in immediately upon benchmark detect.
    The first point applied exclusively to the Exynos 5410 equipped version of the Galaxy S 4. We did a lot of digging to confirm that max GPU frequency (450MHz) was never exceeded on the Snapdragon 600 version. The second point however applied to many, many more platforms.
    The table below covers a subset of the devices we've tested, the silicon inside, and whether or not they detect each benchmark and respond with a max CPU frequency (and all cores plugged in) right away:
    I Can't Believe I Have to Make This Table (Y = cheats in that benchmark)

    | Device | SoC | 3DM | AnTuTu | AndEBench | Basemark X | Geekbench 3 | GFXB 2.7 | Vellamo |
    |---|---|---|---|---|---|---|---|---|
    | ASUS Padfone Infinity | Qualcomm Snapdragon 800 | N | Y | N | N | N | N | Y |
    | HTC One | Qualcomm Snapdragon 600 | Y | Y | N | N | N | Y | Y |
    | HTC One mini | Qualcomm Snapdragon 400 | Y | Y | N | N | N | Y | Y |
    | LG G2 | Qualcomm Snapdragon 800 | N | Y | N | N | N | N | Y |
    | Moto RAZR i | Intel Atom Z2460 | N | N | N | N | N | N | N |
    | Moto X | Qualcomm Snapdragon S4 Pro | N | N | N | N | N | N | N |
    | Nexus 4 | Qualcomm APQ8064 | N | N | N | N | N | N | N |
    | Nexus 7 | Qualcomm Snapdragon 600 | N | N | N | N | N | N | N |
    | Samsung Galaxy S 4 | Qualcomm Snapdragon 600 | N | Y | Y | N | N | N | Y |
    | Samsung Galaxy Note 3 | Qualcomm Snapdragon 800 | Y | Y | Y | Y | Y | N | Y |
    | Samsung Galaxy Tab 3 10.1 | Intel Atom Z2560 | N | Y | Y | N | N | N | N |
    | Samsung Galaxy Note 10.1 (2014 Edition) | Samsung Exynos 5420 | Y(1.4) | Y(1.4) | Y(1.4) | Y(1.4) | Y(1.4) | N | Y(1.9) |
    | NVIDIA Shield | Tegra 4 | N | N | N | N | N | N | N |
    We started piecing this data together back in July, and even had conversations with both silicon vendors and OEMs about getting it to stop. With the exception of Apple and Motorola, literally every single OEM we’ve worked with ships (or has shipped) at least one device that runs this silly CPU optimization. It's possible that older Motorola devices might've done the same thing, but none of the newer devices we have on hand exhibited the behavior. It’s a systemic problem that seems to have surfaced over the last two years, and one that extends far beyond Samsung.
    Looking at the table above you’ll also notice weird inconsistencies in which devices/OEMs choose to implement the cheat/hack/festivities. None of the Nexus devices do, which is understandable since the optimization isn't a part of AOSP. This also helps explain why the Nexus 4 performed so slowly when we reviewed it - this mess was going on back then and Google didn't partake. The GPe versions aren't clean either, which makes sense given that they run the OEM's software with stock Android on top.
    LG’s G2 also includes some optimizations, just for a different set of benchmarks. It's interesting that LG's optimization list isn't as extensive as Samsung's - time to invest in more optimization engineers? LG originally indicated to us that its software needed some performance tuning, which helps explain away some of the G2 vs Note 3 performance gap we saw in our review.
    Note that I’d also be careful about those living in glass houses throwing stones here. Even the CloverTrail+ based Galaxy Tab 3 10.1 does it. I know internally Intel is quite opposed to the practice (as I’m assuming Qualcomm is as well), making this an OEM level decision and not something advocated by the chip makers (although none of them publicly chastise partners for engaging in the activity, more on this in a moment).
    The other funny thing is the list of optimized benchmarks changes over time. On the Galaxy S 4 (including the latest updates to the AT&T model), 3DMark and Geekbench 3 aren’t targets while on the Galaxy Note 3 both apps are. Due to the nature of the optimization, the benchmark whitelist has to be maintained (now you need your network operator to deliver updates quickly both for features and benchmark optimizations!).
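    To make the mechanism concrete, here is a purely illustrative sketch of the kind of package-name whitelist check described above; the package names and policy fields are hypothetical examples, not anyone’s actual vendor code:

```python
# Hypothetical illustration of the benchmark app-detect described in the article.
# Package names and the policy structure are examples only, not vendor code.
BENCHMARK_WHITELIST = {
    "com.example.benchmark.alpha",   # placeholder package names
    "com.example.benchmark.beta",
}

def pick_cpu_policy(foreground_package: str) -> dict:
    """Return the CPU policy an OEM-style detect would apply to this app."""
    if foreground_package in BENCHMARK_WHITELIST:
        # Whitelisted benchmark: jump straight to the highest voltage/frequency
        # state and keep every core online instead of letting the governor ramp.
        return {"governor": "performance", "all_cores_online": True}
    # Everything else gets the normal on-demand behavior.
    return {"governor": "interactive", "all_cores_online": False}

print(pick_cpu_policy("com.example.benchmark.alpha"))
print(pick_cpu_policy("com.example.notepad"))
```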
    There’s minimal overlap between the whitelisted CPU tests and what we actually run at AnandTech. The only culprits on the CPU side are AndEBench and Vellamo. AndEBench is an implementation of CoreMark, something we added more as a way of looking at native vs. Java core performance than as an indicator of overall device performance. I’m unhappy that AndEBench is now a target for optimization, but it’s also not unexpected. So how much of a performance uplift does Samsung gain from this optimization? Luckily we've been testing AndEBench V2 for a while, which features a superset of the benchmarks used in AndEBench V1 and is of course built using a different package name. We ran the same native, multithreaded AndEBench V1 test using both apps on the Galaxy Note 3. The difference in scores is below:
    Galaxy Note 3 Performance in AndEBench

    | | AndEBench V1 | AndEBench V2 running V1 workload | Perf Increase |
    |---|---|---|---|
    | Galaxy Note 3 | 16802 | 16093 | +4.4% |
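    For clarity, the +4.4% figure is simply the whitelisted V1 score expressed relative to the renamed (V2) run of the same workload; a minimal check using the two scores above:

```python
# Percent gain from the app-detect: whitelisted V1 score vs. the renamed V2 run
# of the same AndEBench V1 workload (scores taken from the table above).
whitelisted_score = 16802   # AndEBench V1, detected by the OEM whitelist
renamed_score = 16093       # same workload run from the renamed V2 package

gain_pct = (whitelisted_score / renamed_score - 1) * 100
print(f"+{gain_pct:.1f}%")  # +4.4%
```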
    There's a 4.4% increase in performance from the CPU optimization. Some of that gap is actually due to differences in compiler optimizations (V1 is tuned by the OEMs for performance, V2 is tuned for compatibility as it's still in beta). As expected, we're not talking about any tremendous gains here (at least as far as our test suite is concerned) because Samsung isn't actually offering a higher CPU frequency to the benchmark. All that's happening here is the equivalent of a higher P-state vs. letting the benchmark ramp to that voltage/frequency pair on its own. We've already started work on making sure that all future versions of benchmarks we get will come with unique package names.
    I graphed the frequency curve of a Snapdragon 600 based Galaxy S 4 while running both versions of AndEBench to illustrate what's going on here:
    What we’re looking at is a single loop of the core AndEBench MP test. The blue line indicates what happens naturally, while the red line shows what happens with the CPU governor optimization enabled. Note the more gradual frequency ramp up/down. In the case of this test, all you're getting is the added performance during that slow ramp time. For benchmarks that repeat many tiny loops, these differences could definitely add up. In situations where everyone is shipping the same exact hardware, sometimes that extra few percent is enough to give the folks in marketing a win, which is why any of this happens in the first place.
    Even when the Snapdragon 600 based SGS4 recognizes AndEBench it doesn't seem to get in the way of thermal throttling. A few runs of the test and I saw clock speeds drop down to under 1.7GHz for a relatively long period of time before ramping back up. I should note that the power/thermal profiles do look different when you let the governor work its magic vs. overriding things, which obviously also contributes to any performance deltas.
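    One way to reproduce this kind of frequency trace is to poll the kernel’s cpufreq interface while the benchmark runs; a minimal sketch, assuming a rooted Linux/Android shell and the standard sysfs path (which can vary by device and core):

```python
import time

# Sample the current CPU frequency over time to see how it ramps during a benchmark run.
# Standard Linux cpufreq sysfs node; change cpu0 to watch a different core.
FREQ_NODE = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

def sample_frequency(duration_s=30.0, interval_s=0.1):
    """Return a list of (seconds since start, frequency in MHz) samples."""
    samples = []
    start = time.time()
    while time.time() - start < duration_s:
        with open(FREQ_NODE) as f:
            khz = int(f.read().strip())
        samples.append((time.time() - start, khz / 1000.0))
        time.sleep(interval_s)
    return samples

if __name__ == "__main__":
    for t, mhz in sample_frequency(duration_s=5.0):
        print(f"{t:6.2f}s  {mhz:8.1f} MHz")
```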
    Vellamo is interesting as all of the flagships seem to game this test, which sort of makes the point of the optimization moot:
    Any wins the Galaxy Note 3 achieves in our browser tests are independent of the CPU frequency cheat/optimization discussed above. It’s also important to point out that this is why we treat our suite as a moving target. I introduced Kraken into the suite a little while ago because I was worried that SunSpider was becoming too much of a browser optimization target. The only realistic solution is to continue to evolve the suite ahead of those optimizing for it. The more attention you draw to certain benchmarks, the more likely they are to be gamed. We constantly play this game of cat and mouse on the PC side, it’s just more frustrating in mobile since there aren’t many good benchmarks to begin with. Note that pretty much every CPU test that’s been gamed at this point isn’t a good CPU test to begin with.
    Don’t forget that we’re lucky to be able to so quickly catch these things. After our piece in July I figured one of two things would happen: 1) the optimizations would stop, or 2) they would become more difficult to figure out. At least in the near term, it seems to be the latter. The framework for controlling all of this has changed a bit, and I suspect it’ll grow even more obfuscated in the future. There’s no single solution here, but rather a multi-faceted approach to make sure we're ahead of the curve. We need to continue to rev our test suite to stay ahead of any aggressive OEM optimizations, we need to petition the OEMs to stop this madness, we need to work with the benchmark vendors to detect and disable optimizations as they happen and avoid benchmarks that are easily gamed. Honestly this is the same list of things we do on the PC side, so we've been doing it in mobile as well.
    The Relationship Between CPU Frequency and GPU Performance

    On the GPU front is where things are trickier. GFXBench 2.7 (aka GLBenchmark) somehow avoids being an optimization target, at least for the CPU cheat we’re talking about here. There are always concerns about rendering accuracy, dropping frames, etc., but it looks like the next version of GFXBench will help make sure that no one is doing that quite yet. Kishonti (the makers of GFX/GLBench) work closely with all of the OEMs and tend to do a reasonable job of keeping everyone honest, but they’re in a tricky spot as the OEMs also pay for the creation of the benchmark (via licensing fees to use it). Running a renamed version of GFXBench produced similar scores to what we already published on the Note 3, which ends up being a bit faster than what LG’s G2 was able to deliver. As Brian pointed out in his review however, there are driver version differences between the platforms as well as differences in VRAM sizes (thanks to the Note 3's 3GB total system memory):
    Note 3: 04.03.00.125.077
    Padfone: 04.02.02.050.116
    G2: 4.02.02.050.141
    Also keep in mind that both LG and Samsung will define their own governor behaviors on top of all of this. Even using the same silicon you can choose different operating temperatures you’re comfortable with. Of course this is another variable to game (e.g. increasing thermal headroom when you detect a benchmark), but as far as I can tell even in these benchmark modes thermal throttling does happen.
    The two new targets are tests that we use: 3DMark and Basemark X. The latter tends to be quite GPU bound, so the impact of a higher CPU frequency is more marginalized, but with a renamed version we can tell for sure:
    Galaxy Note 3 Performance in Basemark X

    | | Basemark X | Basemark X (Renamed) | Perf Increase |
    |---|---|---|---|
    | On screen | 16.036 fps | 15.525 fps | +3.3% |
    | Off screen | 13.528 fps | 12.294 fps | +10% |
    The onscreen differences make sense to me; it's the off screen results that are a bit puzzling. I'm worried about what's going on with the off-screen rendering buffer. That seems to be too little of a performance increase if the optimization were dropping frames (if you're going to do that, you might as well go for the gold), but as to what is actually going on I'm not entirely sure. We'll keep digging on this one. The CPU optimization alone should net something around the 3% gain we see in the on screen test.
    3DMark is a bigger concern. As we discovered in our Moto X review, 3DMark is a much more balanced CPU/GPU test. Driving CPU frequencies higher can and will impact the overall scores here.
    ASUS thankfully doesn’t do any of this mess with their Padfone Infinity in the GPU tests. Note that there are still driver and video memory differences between the Padfone Infinity and the Galaxy Note 3, but we’re seeing roughly a 10% performance advantage in the overall 3DMark Extreme score (the Padfone also has a slightly lower clocked CPU - 2.2GHz vs. 2.3GHz). It's tough to say how much of this is due to the CPU optimization vs. how much is up to driver and video memory differences (we're working on a renamed version of 3DMark to quantify this exactly).
    The Futuremark guys have a lot of experience with manufacturers trying to game their benchmarks so they actually call out this specific type of optimization in their public rules:
    "With the exception of setting mandatory parameters specific to systems with multiple GPUs, such as AMD CrossFire or NVIDIA SLI, drivers may not detect the launch of the benchmark executable and alter, replace or override any parameters or parts of the test based on the detection. Period."
    If I'm reading it correctly, both HTC and Samsung are violating this rule. What recourse Futuremark has against the companies is up in the air, but here we at least have a public example of a benchmark vendor not being ok with what's going on.
    Note that GFXBench 2.7, where we don't see anyone run the CPU optimization, shows a 13% advantage for the Note 3 vs. Padfone Infinity. Just like the Exynos 5410 optimization there simply isn't a lot to be gained by doing this, making the fact that the practice is so widespread even more frustrating.
    Final Words

    As we mentioned back in July, all of this is wrong and really isn't worth the minimal effort the OEMs put into playing these games. If I ran the software group at any of these companies, the cost/benefit analysis of chasing these optimizations vs. the negativity in the press would make it an easy decision (not to mention the whole morality argument). It's also worth pointing out that nearly all Android OEMs are complicit in creating this mess. We singled out Samsung for the initial investigation as they were doing something unique on the GPU front that didn't apply to everyone else, but the CPU story (as we mentioned back in July) is a widespread problem.
    Ultimately the Galaxy Note 3 doesn’t change anything from what we originally reported. The GPU frequency optimizations that existed in the Exynos 5410 SGS4 don’t exist on any of the Snapdragon platforms (all applications are given equal access to the Note 3’s 450MHz max GPU frequency). The CPU frequency optimization that exists on the SGS4, LG G2, HTC One and other Android devices, still exists on the Galaxy Note 3. This is something that we’re going to be tracking and reporting more frequently, but it’s honestly no surprise that Samsung hasn’t changed its policies here.
    The majority of our tests aren’t impacted by the optimization. Virtually all Android vendors appear to keep their own lists of applications that matter and need optimizing. The lists grow/change over time, and they don’t all overlap. With these types of situations it’s almost impossible to get any one vendor to be the first to stop. The only hope resides in those who don’t partake today, and of course with the rest of the ecosystem.
    We’ve been working with all of the benchmark vendors to try and stay one step ahead of the optimizations as much as possible. Kishonti is working on some neat stuff internally, and we’ve always had a great relationship with all of the other vendors - many of whom are up in arms about this whole thing and have been working on ways to defeat it long before now. There’s also a tremendous amount of pressure the silicon vendors can put on their partners (although not quite as much as in the PC space, yet), not to mention Google could try to flex its muscle here as well. The best we can do is continue to keep our test suite a moving target, avoid using benchmarks that are very easily gamed and mostly meaningless, continue to work with the OEMs in trying to get them to stop (though tough for the international ones) and work with the benchmark vendors to defeat optimizations as they are discovered. We're presently doing all of these things and we have no plans to stop. Literally all of our benchmarks have either been renamed or are in the process of being renamed to non-public names in order to ensure simple app detects don't do anything going forward.
    The unfortunate reality is this is all going to get a lot worse before it gets better. We wondered what would happen with the next platform release after our report in July, and the Note 3 told us everything we needed to know (you could argue that it was too soon to incite change, perhaps SGS5 next year is a better test). Going forward I expect all of this to become more heavily occluded from end user inspection. App detects alone are pretty simple, but what I expect to happen next are code/behavior detects and switching behavior based on that. There are thankfully ways of continuing to see and understand what’s going on inside these closed platforms, so I’m not too concerned about the future.
    The hilarious part of all of this is we’re still talking about small gains in performance. The impact on our CPU tests is 0 - 5%, and somewhere south of 10% on our GPU benchmarks as far as we can tell. I can't stress enough that it would be far less painful for the OEMs to just stop this nonsense and instead demand better performance/power efficiency from their silicon vendors. Whether the OEMs choose to change or not, however, we’ve seen how this story ends. We’re very much in the mid-1990s PC era in terms of mobile benchmarks. What follows next are application based tests and suites. Then comes the fun part of course. Intel, Qualcomm and Samsung are all involved in their own benchmarking efforts, many of which will come to light over the coming years. The problem will then quickly shift from gaming simple micro benchmarks to which “real world” tests are unfairly optimized for which architectures. This should all sound very familiar. To borrow from Brian’s Galaxy Gear review (and BSG): “all this has happened before, and all of it will happen again.”

    More...

  9. RSS Bot FEED (post #3299)

    Anandtech: Intel Announces Galileo: Quark Based Arduino Compatible Developer Board

    At this year’s IDF Intel announced its third major microarchitecture family: Quark. Before Quark we had Core at the high-end and Atom for smartphones/tablets/cheap PCs. Quark adds a third vector, below Atom, with a focus on even lower power, more cost sensitive markets (e.g. low power embedded).
    Intel finds itself in an interesting position today. When it first launched Atom, x86 compatibility was a selling point - something no competing ARM solution at the time could offer. These days the bulk of the mobile world is built on ARM code. Similarly, because of ARM’s excellent portfolio of super cheap, low power cores, there are many other markets where ARM is just as prevalent. Add to that the fact that some of the lowest cost platforms to develop on and do neat things with run ARM-based silicon, not x86. In other words, there’s an entirely new generation of platforms, developers and applications that aren’t x86 compatible. Over the long run this poses a big problem for Intel. While x86 might not be an advantage in a lot of high growth markets, it’s still an advantage in many others. Any erosion of that advantage simply puts Intel in a much more difficult position in the long run.
    The solution, albeit a bit late, is Quark. The design is 32-bit Pentium ISA compatible (Intel apparently loves starting out new projects with the Pentium ISA), and features a core that should be roughly 1/5 the size of Atom and capable of operating at as little as 1/10 the power. Quark's other major selling point is it is a fully synthesized design. It'll be built exclusively at Intel fabs to start, but Intel made it very clear that if you want a cheap, low power x86 core to integrate alongside your own IP, it'll offer you Quark. Previously Intel provided no such solution, which drove some customers to ARM. You could even speculate on what this means for Intel's strategy as being even more of a player in the foundry space.
    Today Intel is announcing a microcontroller board based on the Quark X1000 SoC called Galileo. The Quark implementation on the board is a single-core running at 400MHz (a single fixed speed; there’s no SpeedStep equivalent here). There’s a 16KB L1 cache and 512KB of on-die embedded SRAM.

    The board features 10/100 Ethernet, a mini-PCIe slot (PCIe Gen 2), a USB 2.0 host controller, a USB client connector, a JTAG header and 256MB of DRAM. Galileo also features 8MB of SPI Flash for firmware/bootloader/sketch storage. MicroSD card support is optional. Galileo measures 4.2 inches long by 2.8 inches wide.
    The other big feature of Galileo is that it is compatible with Arduino software and shields, making it a great target for students and educators in the maker scene.

    It’s good to see Intel doing this sort of stuff, as it's extremely important to get early exposure to x86 among maker enthusiasts if Intel wants to keep x86 around in the long run (although I would’ve liked to have seen it a few years ago). Intel will be giving away 50,000 Galileo boards to 1000 universities worldwide over the next year and a half or so to spark development.
    Gallery: Intel Announces Galileo: Quark Based Arduino Compatible Developer Board

    More...

  10. RSS Bot FEED (post #3300)

    Anandtech: Choosing a Gaming CPU October 2013: i7-4960X, i5-4670K, Nehalem and Intel

    Back in April we launched our first set of benchmarks relating to which CPU we should choose for gaming. To that list we now add results from several Intel CPUs, including the vital data point of the quad-core i5-4670K, some other Haswell CPUs, the new extreme i7-4960X processor, and some vintage Nehalem CPUs we could not get hold of for the first round of results.
    Many thanks go to GIGABYTE for the loan of the Haswell+Nehalem CPUs for this update and for use of an X58A-UD9.

    More...
