
Thread: Anandtech News

  1. RSS Bot FEED
    #4871

    Anandtech: AMD Dives Deep On Asynchronous Shading

    Earlier this month at GDC, AMD introduced their VR technology toolkit, LiquidVR. LiquidVR offers game developers a collection of useful tools and technologies for adding high performance VR to games, including features to better utilize multiple GPUs, features to reduce display chain latency, and finally features to reduce rendering latency. Key among the latter feature set is support for asynchronous shaders, which is the ability to execute certain shader operations concurrently with other rendering operations, rather than in a traditional serial fashion.
    It’s this last item that ended up kicking up a surprisingly deep conversation between myself, AMD’s “Chief Gaming Scientist” Richard Huddy, and other members of AMD’s GDC staff. AMD was keen to show off the performance potential of async shaders, but in the process we reached the realization that to this point AMD hasn’t talked very much about their async execution abilities within the GCN architecture, particularly within a graphics context as opposed to a compute context. While the idea of async shaders is pretty simple – executing shaders concurrently with (and yet not in sync with) other operations – it’s a bit less obvious just what the real-world benefits are and why this matters. After all, aren’t GPUs already executing a massive number of threads?
    With that in mind AMD agreed it was something that needed further consideration, and after a couple of weeks they got back to us (and the rest of the tech press) with further details of their async shader implementation. What AMD came back to us with isn’t necessarily more detail on the hardware itself, but it was a better understanding of how AMD’s execution resources are used in a graphics context, why recent API developments matter, and ultimately why asynchronous shading/computing is only now being tapped in PC games.
    Why Asynchronous Shading Wasn’t Accessible Before

    AMD has offered multiple Asynchronous Compute Engines (ACEs) since the very first GCN part in 2011, the Tahiti-powered Radeon HD 7970. However prior to now the technical focus on the ACEs has been pure compute workloads; true to their name, the ACEs allow GCN GPUs to execute compute tasks from multiple queues. It wasn’t until very recently that the ACEs became important for graphical (or rather mixed graphics + compute) workloads.
    Why? The short answer is that, in yet another stake in the heart of DirectX 11, the API simply wasn’t well suited for asynchronous shading. The same heavily abstracted, driver- and OS-controlled rendering path that gave DX11 its relatively high CPU overhead and poor multi-core command buffer submission also enforced very stringent processing requirements. DX11 was a serial API through and through, both for command buffer execution and, as it turned out, for shader execution.

    As one might expect when we’re poking holes into DirectX 11, the asynchronous shader issues of the API are being addressed in Mantle, DirectX 12, and other low-level APIs. Along with making it much easier to submit work from multiple threads over multiple cores, all of these APIs are also making significant changes in how work is executed. With the ability to accept work from multiple threads, work can now be more readily executed in parallel and asynchronously, enabling asynchronous shading for the first time.

    There is also one exception to the DX11 rule that we’ll get to in depth a bit later, but in short that exception is custom middleware like LiquidVR. Even in a DX11 context LiquidVR can leverage some (but not all) of the async shading functionality of GCN GPUs to do things like warping asynchronously, as it technically sits between DX11 and the GPU. This in turn is why async shading is so important to AMD's VR plans, as all of their GCN GPUs are capable of this and it can be exposed in the current DX11 ecosystem.
    Executing Async: Hardware & Software

    Of course to pull this off you need hardware that can support executing work from multiple queues, and this is something that AMD invested in early. GCN 1.0 and GCN 1.1 Bonaire included 1 graphics command processor and 2 ACEs, while GCN 1.1 Hawaii and GCN 1.2 Tonga (so far) include 1 graphics command processor and 8 ACEs. Meanwhile the GCN-powered Xbox One and Playstation 4 take their own twists, each packing different configurations of graphics command processors and ACEs.

    From a feature perspective it’s important to note that the ACEs and graphics command processors are different from each other in a small but important way. Only the graphics command processors have access to the full GPU – not just the shaders, but the fixed function units like the geometry units and ROPs – while the ACEs only get shader access. Ostensibly the ACEs are for compute tasks and the command processor is for graphics tasks, however with compute shaders blurring the line between graphics and compute, the ACEs can be used to execute compute shaders as well now that software exists to make use of it.
    On a side note, part of the reason for AMD's presentation is to explain their architectural advantages over NVIDIA, so we checked with NVIDIA on queue support. Fermi/Kepler/Maxwell 1 can only use a single graphics queue or their complement of compute queues, but not both at once – early implementations of HyperQ cannot be used in conjunction with graphics. Meanwhile Maxwell 2 has 32 queues, composed of 1 graphics queue and 31 compute queues (or 32 compute queues total in pure compute mode). So pre-Maxwell 2 GPUs have to either execute in serial or pre-empt to move tasks ahead of each other, which would indeed give AMD an advantage.
    GPU Queue Support
    GPU                                   Graphics/Mixed Mode      Pure Compute Mode
    AMD GCN 1.2 (285)                     1 Graphics + 7 Compute   8 Compute
    AMD GCN 1.1 (290 Series)              1 Graphics + 7 Compute   8 Compute
    AMD GCN 1.1 (260 Series)              1 Graphics + 1 Compute   2 Compute
    AMD GCN 1.0 (7000/200 Series)         1 Graphics + 1 Compute   2 Compute
    NVIDIA Maxwell 2 (900 Series)         1 Graphics + 31 Compute  32 Compute
    NVIDIA Maxwell 1 (750 Series)         1 Graphics               32 Compute
    NVIDIA Kepler GK110 (780/Titan)       1 Graphics               32 Compute
    NVIDIA Kepler GK10x (600/700 Series)  1 Graphics               1 Compute
    Moving on, coupled with a DMA copy engine (common to all GCN designs), GCN can potentially execute work from several queues at once. In an ideal case for graphics workloads this would mean that the graphics queue is working on jobs that require its full hardware access capabilities, while the copy queue handles data management, and finally one-to-several compute queues are fed compute shaders. Precisely which tasks go where is up to the game developer, but examples of graphics and compute tasks include shadowing and MSAA for the former, and ambient occlusion, second-order physics, and color grading for the latter.
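    To make the queue model more concrete, below is a minimal sketch of how an application using a low-level API such as Direct3D 12 might create separate graphics, compute, and copy queues and overlap work between them. This is purely our own illustration of the general idea and not AMD or Microsoft sample code; the command list recording is omitted, and names like aoCommandLists are hypothetical placeholders.

        // Illustrative D3D12 sketch: one queue per work type, so compute shaders
        // (e.g. ambient occlusion) and DMA copies can be scheduled alongside the
        // main graphics workload rather than behind it.
        #include <d3d12.h>
        #include <wrl/client.h>
        #pragma comment(lib, "d3d12.lib")

        using Microsoft::WRL::ComPtr;

        int main() {
            ComPtr<ID3D12Device> device;
            // Default adapter; feature level 11_0 is the D3D12 minimum.
            if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                         IID_PPV_ARGS(&device))))
                return 1;

            auto makeQueue = [&](D3D12_COMMAND_LIST_TYPE type) {
                D3D12_COMMAND_QUEUE_DESC desc = {};
                desc.Type = type; // DIRECT = graphics, COMPUTE, or COPY
                ComPtr<ID3D12CommandQueue> queue;
                device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
                return queue;
            };

            // One queue of each flavor; on GCN the compute queue(s) map onto the ACEs.
            ComPtr<ID3D12CommandQueue> gfxQueue     = makeQueue(D3D12_COMMAND_LIST_TYPE_DIRECT);
            ComPtr<ID3D12CommandQueue> computeQueue = makeQueue(D3D12_COMMAND_LIST_TYPE_COMPUTE);
            ComPtr<ID3D12CommandQueue> copyQueue    = makeQueue(D3D12_COMMAND_LIST_TYPE_COPY);

            // A fence lets the graphics queue consume compute results only once they
            // are ready, while the two queues otherwise run asynchronously.
            ComPtr<ID3D12Fence> fence;
            device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));

            // In a real renderer, command lists recorded on worker threads would be
            // submitted here, along the lines of (aoCommandLists etc. are hypothetical):
            //   computeQueue->ExecuteCommandLists(1, aoCommandLists);
            //   computeQueue->Signal(fence.Get(), 1);
            //   gfxQueue->Wait(fence.Get(), 1);  // GPU-side wait; the CPU is not blocked
            //   gfxQueue->ExecuteCommandLists(1, shadowCommandLists);
            return 0;
        }

    The key point is simply that the queues are independent objects that can be fed from different CPU threads, leaving it to the GPU front-end (the graphics command processor and the ACEs on GCN) to interleave the work.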

    As a consequence of having multiple queues to feed the GPU, it is possible for the GPU to work on multiple tasks at once. Doing this seems counter-intuitive at first – GPUs already work on multiple threads, and graphics rendering is itself embarrassingly parallel, allowing it to be easily broken down into multiple threads in the first place. However at a lower level GPUs only achieve their famous high throughput performance in exchange for high latency; lots of work can get done, but relatively speaking any one thread may take a while to reach completion. For this reason the efficient scheduling of threads within a GPU requires an emphasis on latency hiding, to organize threads such that different threads are interleaved to hide the impact of the GPU’s latency.
    Latency hiding in turn can become easier with multiple work queues. The additional queues give you additional pools of threads to pick from, and if the GPU is presented with a situation where it absolutely can’t hide latency from the graphics queue and must stall, the compute queues could be used to fill that execution bubble. Similarly, if there flat-out aren’t enough threads from the graphics queue to fill out the GPU, then this presents another opportunistic scenario to execute threads from a compute task to keep the GPU better occupied. Compared to a purely serial system this also helps to mitigate some of the overhead that comes from context switching.
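    As a toy illustration of that argument (our own back-of-the-envelope model with made-up numbers, not AMD’s scheduler), treat the GPU as a single execution resource, give the graphics queue a stream of jobs interrupted by stall bubbles, and measure how much of the otherwise idle time independent compute jobs can reclaim:

        // Toy model of latency hiding with a second queue: whenever the graphics
        // stream stalls (a "bubble"), independent compute jobs are pulled in instead
        // of letting the execution units sit idle. All numbers are invented.
        #include <cstdio>
        #include <queue>

        int main() {
            // Graphics timeline: positive = cycles of useful work, negative = stall bubble.
            int graphics[] = { 8, -3, 6, -4, 10, -2, 5 };
            int computeJobs[] = { 2, 3, 2 };          // independent compute work, in cycles
            std::queue<int> compute;
            for (int c : computeJobs) compute.push(c);

            int busy = 0, idle = 0, filled = 0;
            for (int job : graphics) {
                if (job > 0) { busy += job; continue; }
                int bubble = -job;                    // cycles the graphics queue cannot fill
                while (bubble > 0 && !compute.empty()) {
                    int c = compute.front(); compute.pop();
                    filled += (c < bubble) ? c : bubble;  // async compute hides part of the stall
                    bubble -= c;
                }
                if (bubble > 0) idle += bubble;       // nothing left to fill the gap with
            }
            int total = busy + filled + idle;         // length of the whole timeline
            std::printf("graphics-only utilization: %.0f%%\n", 100.0 * busy / total);
            std::printf("with async compute:        %.0f%%\n", 100.0 * (busy + filled) / total);
            return 0;
        }

    With these invented numbers the same timeline goes from roughly 76% occupied to roughly 89% occupied; the real-world gain obviously depends entirely on how much independent compute work a game actually has available.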
    Ultimately the presence of the ACEs and the layout of GCN allow these tasks to be done in an asynchronous manner, which ties into the concept of async shaders and is what differentiates this from synchronous parallel execution. So long as the task can be done asynchronously, GCN’s scheduler can grab threads as necessary from the additional queues and load them up to improve performance. Meanwhile, although the number of ACEs can impact how well async shading is able to improve performance by better filling the GPU, AMD readily admits that 8 ACEs is likely overkill for graphics purposes; even a smaller number of queues (e.g. 1+2 in earlier GCN hardware) is useful for this task, and the biggest advantage is simply in having multiple queues in the first place.
    The Performance Impact of Asynchronous Shaders

    Execution theory aside, what is the actual performance impact of asynchronous shaders? This is a bit harder of a question to answer at this time, mostly because there’s virtually nothing on the PC capable of using async shaders due to the aforementioned API limitations. Thief, via its Mantle renderer, is the only PC game currently using async shaders, while on the PS4 and its homogeneous platform there are a few more titles making use of the tech.

    AMD for their part does have an internal demo showcasing the benefits of async shaders, utilizing a post-process blurring effect with and without async shaders, and the performance differences can be quite high. However it’s a synthetic demo, and like all synthetic demos the performance gains represent something of a best-case scenario for the technology. So AMD’s 46% performance improvement, though quite large, is not something we’d expect to see in any game.
    That said, VR (and by extension, LiquidVR) presents an interesting and more straightforward case for the technology, which is why both NVIDIA and AMD have been pursuing it. Asynchronous execution of time warping and other post-processing effects will on average reduce latency by filling those rendering bubbles: time warping itself reduces perceived latency by altering the image at the last possible second, while async execution reduces the total amount of time a frame spends in the GPU being rendered. The actual latency impact will again not be anywhere near the 46% performance improvement in AMD’s sample, but in the case of VR every millisecond counts.
    Of course to really measure this we will need games that can use async shaders and VR hardware – both of which are in short supply right now – but the potential benefits are clear. And if AMD has their way, both VR and regular developers will be taking much greater advantage of the capabilities of asynchronous shading.

    More...

  2. RSS Bot FEED
    #4872

    Anandtech: Intel Braswell Details Quietly Launched: Cherry Trail and Airmont on 14nm

    Intel’s 14nm tri-gate process has so far been seen in both Core M (Broadwell-Y) and Broadwell-U, with some discussions at Mobile World Congress regarding Atom x5 and Atom x7, both featuring 14nm cores at their heart. For the mini-PC and laptop space, Core M fits nicely with a 4.5W TDP and the Core architecture; however Intel’s Atom line also occupies a similar segment but at a lower price point. The upgrade from Bay Trail is Cherry Trail, moving from 22nm Silvermont cores to 14nm Airmont cores. Technically it would seem that Cherry Trail is a catch-all name, with the SoCs intended for mini-PCs also riding under the name ‘Braswell’, using up to four Atom cores and Generation 8 graphics within a 4-6W TDP.
    CPU World recently published details of four Braswell SKUs. For Braswell, similar to Bay Trail, Intel designs its Atom SoCs in terms of dual-core modules, where each core is separate apart from a shared L2 cache. The SoC then puts one or two of these modules on die (for two or four cores) without an overriding L3 cache. The Braswell SoCs will support DDR3-1600 memory and SIMD instructions up to SSE4, along with VT-x and Burst Performance Technology, which offers higher clocks for extremely short periods when required.
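    As a small aside, the supported SIMD level is something software can simply probe at runtime via CPUID rather than relying on spec sheets. The sketch below uses the GCC/Clang builtins for this (a generic illustration, not Braswell-specific; on such a part one would expect SSE4.1/4.2 to report yes and AVX2 no):

        // Quick runtime check of the SIMD level a CPU exposes, using GCC/Clang
        // builtins (MSVC users would call __cpuid from <intrin.h> instead).
        #include <cstdio>

        int main() {
            __builtin_cpu_init();  // populate the feature cache before querying it
            std::printf("SSE4.1: %s\n", __builtin_cpu_supports("sse4.1") ? "yes" : "no");
            std::printf("SSE4.2: %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
            std::printf("AVX2  : %s\n", __builtin_cpu_supports("avx2")   ? "yes" : "no");
            return 0;
        }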
    The four SoCs are presented as follows:
    Intel Braswell SKUs
    SKU            Cores/Threads  CPU Freq (MHz)  CPU Burst (MHz)  L2 Cache  TDP  Price
    Celeron N3000  2 / 2          1040            2080             1 MB      4W   $107
    Celeron N3050  2 / 2          1600            2160             1 MB      6W   $107
    Celeron N3150  4 / 4          1600            2080             2 MB      6W   $107
    Pentium N3700  4 / 4          1600            2400             2 MB      6W   $161
    This is similar to elements of both the Bay Trail-M (Mobile) and Bay Trail-D (Desktop) product lines, which would perhaps mean that we will see both inside mini-PCs as well as some laptop designs, such as Chromebooks. In the current Braswell list there are two dual-core models and two quad-core models, whereas the Bay Trail-D line has six in total, with four Celeron and two Pentium parts. The Celeron N3000 from the Braswell line is an interesting element to consider, especially when we compare it against the similar TDP of the Core M 5Y10.
                          Celeron N3000   Core M 5Y10
    Architecture          Airmont         Broadwell
    Cost                  $107            $281
    Cores / Threads       2 / 2           2 / 4
    Base Frequency (MHz)  1040            800
    Turbo / Burst (MHz)   2080            2000
    L2 Cache              1 MB            0.5 MB
    L3 Cache              -               4 MB
    TDP                   4 W             4.5 W
    GPU                   'Gen 8'         HD 5300
    Execution Units       Unknown ?       24
    GPU Frequency (MHz)   320-600 ?       100-800
    DRAM                  DDR3-1600       DDR3/L-1600
    Ultimately Intel’s differentiation lies with the architecture and price, meaning Core costs more, and historically this also correlates with performance. That being said, Core M is susceptible to both cTDP-Up and cTDP-Down depending on how the OEM wants to use it. Braswell may be in a similar position, although we do not have confirmation of this as of yet.
    It will be interesting to see what applications for Braswell will be released first. I would imagine everything we currently see in Bay Trail-D form should get an upgrade. We have already seen shots of ECS’ roadmap for the LIVA, for example, which states a Q2 2015 launch.
    Source: CPU World


    More...

  3. RSS Bot FEED
    #4873

    Anandtech: Microsoft Announces Surface 3: 10.8-inch 2-in-1 with Atom x7 on 14nm from $499

    The next element of Microsoft’s Surface line is here, and the anticipated Surface 3 throws up a couple of (nice) surprises. Starting at $499, the Surface 3 will complement the Surface Pro 3 by offering a 10.8-inch device with a 1920x1280 resolution. That sounds a little odd, being a bit more than full HD, but it offers the same 3:2 aspect ratio as the larger Surface Pro 3. Under the hood is Intel’s new Atom x7, which we discussed briefly during the Atom re-naming launch earlier this year; this means a 14nm-class device featuring Airmont cores, the direct upgrade from Silvermont and Bay Trail. The release states that this is the high-end model, which would suggest a quad-core Atom design running above 2 GHz. Microsoft/Intel are not directly calling this Cherry Trail, and our discussions with Intel seem to avoid the Cherry Trail nomenclature, but the SoC will be partnered with 64GB or 128GB of storage, plus a 4G ‘LTE Ready’ version will be coming later.
    The Surface 3 is being billed by Microsoft as the thinnest and lightest Surface device, and will run full Windows 8.1, which can be upgraded to Windows 10 later this year for free. The price will include a 1-year subscription to Office 365, as well as 1TB of OneDrive storage. On the device will be a full-size USB 3.0 port, a mini-DisplayPort and a microSD card reader to supplement storage. Charging comes via a bundled fast-charging micro-USB adapter, although the device can also be charged with a standard smartphone micro-USB charger as well. Battery life is listed as 10 hours for video playback, with the screen being described as having ‘incredibly accurate colors’ – here’s hoping for a calibrated display out of the box. Front and rear cameras (3.5MP / 8MP) are both designed to capture 1080p, with an auto-focus feature on the rear camera.
    The device on its own will be 8.7mm thin, weighing in at 622 grams (1.37 pounds), and seems to feature the kickstand that Anand liked in his Surface Pro 3 review. Accessories start with the standard Type Cover but also include a Docking Station with more USB ports as well as ‘The Surface Pen’. The new digital pen will be available in red, blue, black and silver with 256 levels of pressure sensitivity - we presume this is an N-Trig design although we’re waiting for official confirmation.
    The Surface 3 and accessories are now available for pre-order in the US, shipping on May 5th. Resellers and partners should have availability on May 7th, although from April 1st users should be able to head into a Microsoft Store in Canada, Puerto Rico and the United States for some hands-on time before full launch.
    We’ve already put in our request for a review unit.
    Source: Microsoft
    Gallery: Microsoft Announces Surface 3: 10.8-inch 2-in-1 with Atom x7 on 14nm from $499


    Microsoft Surface 3
    Size 10.52 x 7.36 x 0.34-inch
    267 x 187 x 8.7-mm
    Weight 1.37 lbs - 622 g
    Display 10.8-inch ClearType Full HD Plus
    1920x1280 resolution, 3:2 ratio
    10-point multi-touch
    Surface Pen Support
    Battery Life Up to 10 hours (video playback)
    Storage/DRAM 64GB / 2GB
    128GB / 4GB
    CPU Atom x7-Z8700
    Quad Core 14nm
    1.6 GHz Base Frequency
    2.4 GHz Burst Frequency
    WiFi 802.11ac + BT 4.0
    LTE Models at a later date
    Ports USB 3.0, Mini-DisplayPort, microSD,
    Micro USB charging, 3.5mm Headset Jack
    Software Windows 8.1
    Office 365 Personal with 1TB OneDrive (1-year)
    Front Camera 3.5 MP
    Rear Camera 8.0 MP with Autofocus
    Warranty 1-year limited
    Price $499
    $599


    More...

  4. RSS Bot FEED
    #4874

    Anandtech: The Samsung SSD 850 EVO mSATA/M.2 Review

    Four months ago Samsung introduced the world to TLC V-NAND in the form of the SSD 850 EVO. It did well in our tests and showed that 3D NAND technology essentially brings TLC NAND to the level where planar MLC NAND stands today. The initial launch only included the most popular 2.5" form factor, and did not address the upgrade market, where mSATA and M.2 are constantly growing in popularity. With today's release, Samsung is expanding the 850 EVO lineup with M.2 and mSATA models.

    More...

  5. RSS Bot FEED
    #4875

    Anandtech: CTL Launches New Chromebook For Education

    The education sector is one area where Google’s Chromebook has proved very popular. Relatively inexpensive devices, which are easier to manage, and include just a lightweight operating system, have certainly gained a foothold there. School Divisions which have bought into the Google Apps ecosystem would seem to have an easy decision to move to Chrome OS.
    There are several companies which specialise in the education sector. CTL is one of those companies, and today they are launching a new, even lower cost entry into the Chrome OS education market. The CTL Chromebook J2 and J4 for Education both feature a quad-core Cortex A17 based processor, in this case the Rockchip RK3288. The differentiation is laid out in the name, with the J2 featuring 2 GB of memory and the J4 having 4 GB of RAM.
    As these are aimed at the less than forgiving student population, they are available with a three year warranty with accidental damage coverage. Also, of interest to the sector, they will come with one year of Securly content filtering and analysis, so that schools and parents can set automatic filters for approved sites. In addition, they can be bundled with the Chrome Device Management licenses, Hapara licenses, and Pearson Education Software and eTextbooks.
    These are low cost devices, and as such are outfitted with some low cost components. The J2 starts at $179, or $199 with the Chrome Device Management license. The J4 bumps the price to $209 and $229 respectively. Both feature an 11.6 inch 1366x768 matte display, 16 GB of eMMC storage, and a 1.3 MP webcam.
    CTL lists the new Chromebooks at over nine hours of battery life, which should be adequate for most school tasks. The devices are relatively thin and light too, coming in at just 2.46 lbs (1.12 kg), and they feature an HDMI port, two USB 2.0 ports, a micro SD card slot, and 802.11ac wireless.
    While these will not be the fastest devices available with Chrome OS (for that, a school would have to purchase the new Pixel), getting the price down should help out with school budgets.
    For those in education who want to check out the new devices, the CTL Chromebook for Education J2 and CTL Chromebook for Education J4 can be sourced from www.ctl.net.
    Source: CTL


    More...

  6. RSS Bot FEED
    #4876

    Anandtech: LG 34UM67: UltraWide FreeSync Review

    AMD officially launched FreeSync earlier this month, and the technology is interesting not just in how it works but also in how it differs from NVIDIA’s G-SYNC. Our first FreeSync display comes by way of LG, and it boasts an IPS display with an UltraWide 2560x1080 resolution. For gamers there are certainly benefits to discuss, but there are also some problem areas. How does this FreeSync display stack up against other gaming monitors, and how does it fare outside of gaming? Read on for our full review.

    More...

  7. RSS Bot FEED
    #4877

    Anandtech: Google Announces $149 Chromebooks and the Chromebit Chrome OS Stick

    Today Google announced a number of new Chrome OS products that will be available in the future from their OEM partners. The main focus of all these devices appears to be pushing the price of Chrome OS devices even lower so that they become accessible to more people.
    The first two devices announced are the Haier Chromebook 11 and the Hisense Chromebook. Both of these laptops have 11.6" 1366x768 displays, 16GB of eMMC storage, 2GB of DDR3L memory, and surprisingly, 2x2 802.11ac WiFi. The main aspect that they differ on is their processors, and subsequently, their battery life. The Haier Chromebook 11 uses a Rockchip RK3288 SoC which has four Cortex A17 cores with a max frequency of 1.8GHz, and a 600MHz ARM Mali-T764 GPU. It advertises a battery life of up to 10 hours. The Hisense Chromebook also uses the Rockchip RK3288, but despite using the same name as the chip in the Haier Chromebook, it has a max CPU frequency of 2.5GHz. Hisense advertises a battery life of up to 8.5 hours. Both of these devices are sure to be popular with educational institutions and anyone looking for a very inexpensive machine to browse the web on.
    Possibly the more interesting announcement of the day is the Chromebit. There's very little information about specifications, but the Chromebit is essentially a Chrome OS computer on a stick which can be connected to a display and other peripherals to be used as a computer. The Chromebit will be launching in the summer of this year for less than $100, and we'll likely see more concrete pricing and information about specifications as we get closer to its release date.


    More...

  8. RSS Bot FEED
    #4878

    Anandtech: PlayStation Plus April 2015 Free Games Preview

    April has come and with it, Sony is bringing us their standard bevy of games for PlayStation Plus members. Sony, as usual, is offering six games in total, with two specifically aimed at each platform but with cross buy available on some titles.
    PlayStation 4

    Tower of Guns



    The first game for the PlayStation 4 is Tower of Guns from developer Terrible Posture Games. This is a single-player first-person shooter which is set in… wait for it… a tower full of guns! This shooter was released in March 2014 on the PC and has been ported to the consoles. It has randomized levels for a new take every time you play. The game was released to above average reviews, scoring a 77 Metascore and 7.3 User Score on metacritic. Tower of Guns normally retails for $14.99 on the PC and should be around that price on the PlayStation Store as well, and will be cross buy with the PS3.
    “A game that needs little introduction. Why? Because it’s a tower filled with guns! Over-the-top in all the right ways, try to survive this tower with random enemies, bosses, power-ups, and a boatload of bullets.”
    Never Alone

    Developed by Upper One Games, Never Alone is a puzzle-platformer with an Inuit theme, where the main characters are an Iñupiat girl named Nuna and an Arctic Fox. Players swap control between the two characters to solve the levels and advance. It was recently released on the PS4, with a launch date of November 2014. Despite the artistic graphical style, Never Alone was met with mixed reviews. It scored a 73 Metascore and 7.0 User Score on metacritic. Never Alone normally sells for $14.99 on the PS Store.
    “This beautiful puzzle adventure tasks you with guiding Nuna, a native Alaskan girl, through breath-taking environments to save her village. Also, you get to travel with a fox! Can’t get much better than a pet fox.”
    PlayStation 3

    Dishonored

    Originally released in October 2012, Dishonored is a stealth action game from Arkane Studios. Players control Corvo Attano, who is framed for a murder and takes revenge against those who conspired against him. It has a strong emphasis on stealth. Being published by Bethesda Softworks has brought this game a lot of attention, and it was met with strong reviews. Dishonored has an 89 Metascore and 7.8 User Score on metacritic, and normally sells for $19.99.
    “Players take control of Corvo Attano, a bodyguard framed for murder and imbued with powerful abilities to seek revenge. This game is all about player choice, and feeling awesome while executing extreme stealth maneuvers.”
    Aaru’s Awakening

    Aaru’s Awakening is a hand drawn 2D action platformer game from the Icelandic developer Lumenox Games. This game is set in the dreamy world of Lumenox, and the goal of the creator was to set up a euphoric and dreamy experience with the artistic style. It was first released only a month ago in February 2015 on the PC, and is now making its way to the PlayStation. Unfortunately the nice visuals have not translated into a well-received game, with this Indie game getting only a 60 Metascore and 3.4 User Score on metacritic. As it is new to the store, pricing is not set yet but expect it to be around $14.99 as it is on Steam. It will also be available on the PS4.
    “What started as a school project for a team of students in Iceland is now a gorgeous platformer about navigating treacherous terrain with well-timed teleports.”
    PlayStation Vita

    Killzone: Mercenary

    The first game for the Vita is Killzone: Mercenary, from developer Guerrilla Cambridge and released in September 2013. This is the second handheld game set in the Killzone series, and the first-person shooter has the player following mercenary Arran Danner against the backdrop of an interstellar war. It has received good reviews, scoring a 78 Metascore and 8.9 User Score on metacritic, and normally sells for $35.99.
    “View the iconic war against the Helghast from a mercenary’s point of view, and jump into a robust multiplayer mode to show how good (Or bad?) you actually are.”
    MonsterBag

    In a big departure in genre from the previous game, the second game for the Vita is MonsterBag from Iguana Bee. This is another new game to the store, and will be available on April 7th. It is a puzzle platformer about a little blue monster called V, who has some telekinetic skills and is on a journey to get back to his friend Nia. Being an unreleased game, there is not much info on how the game plays or what it will sell for, but if you are a PS Plus subscriber, at least you know you won’t already have this interesting-looking game.
    “An adorable puzzle game about a bag-shaped monster named V trying to reach his friend Nia without scaring the pants off of people. May or may not include a battle of wits, skill, and the inevitable apocalypse.”
    There we have it. Six more games from Sony to keep subscribers busy for the month of April. If you are too enamored with the spring weather, pick these up and save them for the depths of winter.
    Source: PlayStation Blog



    More...

  9. RSS Bot FEED
    #4879

    Anandtech: AMD To Face Securities Fraud Lawsuit

    In a bit of news that’s unfortunately not an April Fool’s joke, a US District Court has ruled that AMD must face claims from investors over potential securities fraud committed by the company.
    At the heart of the matter is AMD’s Llano APU. Launched in 2011, in Q3 of 2012 AMD had to take an inventory write-down of $100 million on unsold Llano inventory, as the company had to further reduce prices on the chips in order to sell them in the face of competition from Intel along with the ramp-up of their own Trinity APUs. The writedown in this case did not directly cost the company $100M, but it essentially reduced the value of the company by that much to AMD’s shareholders, whose stock in turn suffered a hit in value.
    What makes this writedown grounds for a lawsuit are the events that led up to it and how AMD handled it. The participating investors are accusing AMD of committing securities fraud over how they presented the state of Llano production. The suit claims that Llano production was not as strong as AMD was claiming – a consequence of supply issues with GlobalFoundries’ 32nm process – and as a result AMD artificially inflated the value of the company in 2011 and 2012, and in the process produced too many Llano chips once GlobalFoundries was finally able to catch up. This in turn led to AMD’s $100M writedown and an overall decline in the value of the company and its stock price (with AMD losing about ¾ of its peak value in 2012).
    These types of lawsuits are not particularly uncommon, especially as institutional investors seek restitution for money they lost from the drop in stock price. That said, today’s ruling is only over whether the lawsuit can go to trial and not over the validity of the claims themselves, never mind what specifically the investors are asking for. So it is likely that the actual lawsuit will take quite a bit longer to resolve.


    More...

  10. RSS Bot FEED
    #4880

    Anandtech: HTC Launches the One M8s

    Today HTC is announcing a new smartphone that sits somewhere between their mid-range smartphones and their flagship ones. This new phone is the One M8s, and even with a quick glance you will notice that it looks very similar to HTC's previous flagship smartphones, the HTC One M8. In fact, the two devices are essentially identical in terms of their appearance and construction, with HTC's official specifications showing only a 0.2mm difference in maximum thickness between the two. Despite being nearly visually identical, the specifications of the M8 and the M8s differ in several ways, and I've compared them in the chart below.
    HTC One M8s vs HTC One (M8)
    SoC           M8s: Qualcomm Snapdragon 615, 4 x Cortex A53 at 1.7GHz + 4 x Cortex A53 at 1.0GHz | M8: Qualcomm Snapdragon 801 (MSM8974ABv3), 4 x Krait at 2.26GHz
    RAM/NAND      M8s: 2GB LPDDR3, 16GB NAND + microSD | M8: 2GB LPDDR3, 16/32GB NAND + microSD
    Display       Both: 5" 1080p LCD
    Network       Both: 2G / 3G / 4G LTE (Qualcomm MDM9x25, UE Category 4 LTE)
    Dimensions    M8s: 146.36 x 70.6 x 9.55mm max, 160 grams | M8: 146.36 x 70.6 x 9.35mm max, 160 grams
    Camera        M8s: 13 MP F/2.0 rear facing, 28mm (35mm equiv), plus rear depth camera; 5MP F/2.8 front facing | M8: 4.0 MP rear facing with 2.0 µm pixels, 1/3" CMOS size, F/2.0, 28mm (35mm equiv), plus rear depth camera; 5MP F/2.0 front facing
    Battery       M8s: 2840 mAh (10.79Wh) | M8: 2600 mAh (9.88Wh)
    OS            Both: Android 5.0 with HTC Sense
    Connectivity  M8s: 802.11a/b/g/n/ac + BT 4.1, USB 2.0, GPS/GNSS, NFC | M8: 802.11a/b/g/n/ac + BT 4.0, USB 2.0, GPS/GNSS, NFC
    SIM Size      Both: NanoSIM
    At least on paper the M8s shares its chassis, cellular connectivity, and display with the M8, although it remains to be seen if it indeed uses the exact same LCD panel. WiFi and Bluetooth connectivity is also similar, although the M8s ships with Bluetooth 4.1 capable firmware out of the box. Beyond these specifications is where the differences between the two devices begin to arise.
    The first major difference is obviously the SoC. While the One M8 used Qualcomm's Snapdragon 801 with four Krait cores at 2.26GHz, the M8s opts for Qualcomm's Snapdragon 615 which has two clusters of four Cortex A53 cores at 1.7GHz and 1.0GHz respectively. It will be interesting to see how Snapdragon 615 compares to 801 with regards to performance as well as power consumption. On that note, the M8s is also able to fit a higher capacity battery in the same chassis as the original M8.
    The last major difference between the two is the camera setup. While the One M8 used HTC's 4MP "UltraPixel" sensor, the M8s opts for a 13MP sensor. Despite this, it retains the secondary depth camera, which enables HTC's duocam post-processing effects like depth of field and refocusing photos.
    The comparison to the original One M8 is really just to show how the two devices differ inside since they share the same appearance. While the M8 was HTC's flagship smartphone, the M8s is not quite at that level. It seems to sit somewhere between the HTC Desire 826 and the HTC One M9, with a more premium design than the 826 but very similar specifications. As far as availability is concerned, the HTC One M8s will be launching in several parts of Europe. Pricing will vary from region to region, but the cost in the United Kingdom has been confirmed at £379.99 outright which gives some idea as to how much the HTC One M8s will cost compared to the One M9 and Desire 826 that sit above and below it respectively.


    More...
