
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #4611

    Anandtech: Intel Announces Curie: Tiny Module for Wearables

Yesterday, Intel announced a new module targeted at wearable technology: Intel Curie. This module continues Intel's push into IoT and wearable technology that started with the Intel Edison. While we thought Edison was small, at just above the size of a US postage stamp, Curie goes even further. Intel has not revealed exact dimensions, but the overall package appears to be the size of a US dime, or a small button. The circular PCB shape is also distinctive and novel. Curie is so small it could theoretically be integrated into rings.
Despite its size, Curie packs in a reasonable amount of functionality. Curie is headlined by a new SoC that Intel claims is their first purpose-built for wearables: the Quark SE. Curie was only just announced, so we do not have complete specifications yet, but I have compiled what is available.
                 Intel Curie                           Intel Edison Development Platform
    CPU          Quark SE @ ? MHz                      Dual-core Silvermont Atom @ 500 MHz + Quark @ 100 MHz
    RAM          80 kB SRAM                            1 GB LPDDR3 (2x32-bit)
    WiFi / BT    "BT Low Energy"                       2.4/5 GHz 802.11a/b/g/n, BT 4.0
    Storage      384 kB flash                          4 GB eMMC
    I/O          Battery charging PMIC                 SD + UART + SPI + GPIO + USB 2.0 OTG
    OS           Open source real-time OS              Yocto Linux v1.6 (CPU); open source real-time OS (MCU)
    Dimensions   Approx. US dime (~18 mm diameter)     35.5 x 25 x 3.9 mm
    Sensors      Integrated DSP sensor hub with        -
                 pattern matching; 6-axis combo
                 sensor (accelerometer and gyroscope)
Intel did not specify whether the Bluetooth antenna is built into the PCB or needs to be added externally. As Curie integrates sensors and a battery-charging PMIC directly, whereas Edison provides interfaces to connect to those same features, it is clear Intel designed Curie to be standalone. Therefore, accounting for the other hardware that needs to be built around Edison, the size difference grows.
Intel's Curie does not include an applications processor and instead relies entirely on the MCU. This may seem limiting compared to powerful Galaxy Gear or Android Wear devices, but there are many devices, such as the Fitbit and even the Microsoft Band, that also exclude an applications processor. This should enable Curie to deliver exceptional battery life; however, Intel provided no power consumption figures.
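As a rough illustration of the kind of workload an MCU-only wearable handles, here is a naive accelerometer-based step counter. This is purely our own sketch for context – it is not Curie firmware, and the threshold value is invented:

```python
# Naive step counter of the kind an MCU-only wearable might run: count
# rising edges of accelerometer magnitude over a threshold. Purely our
# illustration; not Curie firmware, and the threshold is invented.
import math

def count_steps(samples, threshold=11.0):
    """samples: iterable of (x, y, z) accelerometer readings in m/s^2."""
    steps, above = 0, False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag > threshold and not above:
            steps += 1  # rising edge over the threshold counts as one step
        above = mag > threshold
    return steps

# Resting (gravity only), then three simulated foot strikes:
data = [(0, 0, 9.8)] * 5 + [(0, 0, 13.0), (0, 0, 9.8)] * 3
print(count_steps(data))  # -> 3
```

Real sensor-hub firmware does far more filtering and pattern matching, but the point stands: this class of processing needs kilobytes, not gigabytes, of RAM.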
Curie will ship in 2H 2015 and be bundled with Intel IQ Software Kits. Intel IQ is a set of algorithms, device software, smartphone apps, and cloud integration (management, analytics, user and company portals) that breaks down into two components, Body IQ and Social IQ, with biometric and connectivity focuses respectively.



    More...

  2. #4612

    Anandtech: MediaTek Demonstrates 120 Hz Mobile Display

    While we often don’t deeply discuss MediaTek as a company, they are a major force in the mobile space. Their SoCs are widely used in the mid-range and budget segments of the mobile market, and they have widespread OEM adoption due to their turn-key reference designs. However, despite this mid-range positioning we saw an interesting demo of 120 Hz mobile displays at their CES press event, which can be seen below.
    While the video is in slow motion to demonstrate the difference, in practice the benefit of the higher refresh rate is still quite visible. Text scrolling and motion was visibly clearer and more fluid, although it’s possible that displays with poor refresh rates wouldn’t see nearly as much benefit. MediaTek claims that this feature would increase display power consumption by about 10%, although it’s unclear whether this is with dynamic refresh rate adjustment or constant refresh rate. Features like this seem to be part of MediaTek’s new strategy of bringing value to the mid-range, and it will be interesting to see if MediaTek’s focus on Asia will continue to pay off.
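As a back-of-the-envelope illustration of what the higher refresh rate buys (our own arithmetic, not MediaTek figures): halving the frame time also halves how far scrolling content jumps between consecutive refreshes.

```python
# Back-of-the-envelope arithmetic for 60 Hz vs 120 Hz panels. Our own
# figures for illustration; only the ~10% power number is MediaTek's.

def frame_time_ms(refresh_hz):
    """Time between display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

def scroll_step_px(scroll_speed_px_s, refresh_hz):
    """How far scrolling content jumps between consecutive refreshes."""
    return scroll_speed_px_s / refresh_hz

for hz in (60, 120):
    print(f"{hz:>3} Hz: {frame_time_ms(hz):5.2f} ms per frame, "
          f"{scroll_step_px(1800, hz):4.1f} px jump at 1800 px/s")
```

At a brisk 1800 px/s scroll, the 120 Hz panel moves content in 15 px steps instead of 30 px steps, which is why text tracking looks visibly smoother.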


    More...

  3. #4613

    Anandtech: Dell CES Hands On

We got a chance to get some hands-on time with the new Dell products yesterday. Likely the biggest announcement was the Dell XPS 13, with what Dell is calling an Infinity Display. The Dell Venue 8 7000 tablet also has the Infinity Display, and it is a great-looking piece. Also on display were the Alienware products, with the top-of-the-line Area 51 showing prominently. The Alienware 13 was there as well, with the external graphics amplifier. New to the lineup are the slimmed-down versions of the Alienware 15 and 17, both of which also support the external graphics amplifier.
    The Area 51 is Alienware’s top gaming desktop, available with X99 based CPUs and multiple graphics cards. It was announced a couple of months ago, and the first thing anyone will notice is the distinct shape and styling of the case. The internals are mounted on an angle, which is said to enhance cooling.
Next we got to check out the laptops from Alienware. The 13 was previously announced, but the new models are the 15 and 17, both of which share styling with the 13. They are much thinner than their predecessors. As seems to be the case with Alienware, almost everything has backlights, including the trackpads. It is an interesting effect, and it is fully customizable: owners can set the lighting to whatever color they like, or disable it if they prefer. On top of this, if someone wants more graphics power than the internal GPU can provide, the graphics amplifier allows pretty much any graphics card to be leveraged to power either the internal display, or an external monitor for more of a dock experience.
    Gallery: Dell CES Hands On Alienware


    To me the most exciting thing from Dell was the new XPS 13. Dell basically managed to fit a 13” display into an 11” chassis, and the result is a super thin bezel which Dell calls Infinity Display. It was striking to look at, and according to Dell’s representative the display is IGZO, which should mean that it has the standard RGB stripe. Broadwell is the name of the game this CES, and the XPS 13 is powered by the new Intel CPU. Also on display was the updated XPS 15, which is now sporting a 4K display.
Also on the show floor were a couple of tablets from Dell. Dell's Venue line is divided between Windows and Android, with the Windows models carrying the "Pro" name. Dell's recently launched Venue 11 Pro was available to see, and this 11" tablet is offered with either Atom or Core M processors.
The Venue 8 7000 tablet was striking to see in the dark room, with its 2560x1600 OLED display packing quite a punch and delivering the incredible blacks that OLED is known for. It is Intel powered, and performance was quite good. This will be Bay Trail of course, with Cherry Trail just now shipping to OEMs.
    Dell has some great new products coming out, and I love to see how coordinated the styling is getting among the manufacturers. We look forward to getting some of these in for review to dig into them some more.


    More...

  4. #4614

    Anandtech: AMD Shows Off Multiple FreeSync Displays

    We met with AMD and among other things, one item they wanted to show us was the essentially final versions of several upcoming FreeSync displays. They had several displays running, including a 28” 4Kp60 display from Samsung, a 27” 144Hz QHD display from BenQ, and a 75Hz 2560x1080 34” display from LG. The three displays mentioned were all running on different GPUs, including an R9 285 for the BenQ, R9 290X for the Samsung display, and an A10-7850K APU was powering the LG UltraWide display.
    More important than the displays and hardware powering them is the fact that FreeSync worked just as you’d expect. AMD had a demo running with a large vertical block of red moving across the display. We could then enable/disable FreeSync and V-SYNC, we could set the target rendering speed from 40 to 55 Hz in 5Hz increments, or we could set it to vary over time between 40 Hz and 55 Hz. The display meanwhile was able to show the current refresh rate, and with FreeSync enabled we could watch the fluctuations.
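The benefit of matching refresh to render rate can be sketched with a simplified timing model (our own illustration, not AMD's demo code): with fixed-rate v-sync a frame that misses a refresh tick waits for the next one, while with adaptive sync the panel refreshes when the frame is ready, within its supported range.

```python
# Simplified model of fixed-rate v-sync vs. adaptive refresh, using the
# 40-55 Hz range from AMD's demo. Our own sketch for illustration;
# not AMD's implementation.
import math

def vsync_display_time(render_done_ms, refresh_hz=60):
    """With v-sync, a finished frame waits for the next fixed refresh tick."""
    period = 1000.0 / refresh_hz
    return math.ceil(render_done_ms / period) * period

def adaptive_display_time(render_done_ms, min_hz=40, max_hz=55):
    """With adaptive sync, the panel refreshes when the frame is ready,
    clamped to the refresh range the panel supports."""
    min_period = 1000.0 / max_hz  # cannot refresh faster than max_hz
    max_period = 1000.0 / min_hz  # must refresh before the min_hz deadline
    return min(max(render_done_ms, min_period), max_period)

# A frame finishing at 21 ms (~48 fps) just misses the 60 Hz tick:
print(vsync_display_time(21))     # held until the 33.3 ms tick
print(adaptive_display_time(21))  # shown immediately at 21 ms
```

That 12 ms of added wait per missed tick is exactly the stutter the demo's moving red block makes visible when FreeSync is toggled off.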
Having seen and used G-SYNC, there was nothing particularly new being demonstrated here, but it is proof that AMD's FreeSync solution is ready, and it should be available in the next few months. What remains to be seen now is just how much more FreeSync displays will cost on average compared to non-FreeSync displays. FreeSync is royalty free, but that doesn't mean there are no additional costs involved in creating a display that works with FreeSync. There's a need for better panels and other components, which will obviously increase the BoM (bill of materials), and that cost will be passed on to consumers.
    The real question will be how much FreeSync displays end up costing compared to G-SYNC equivalents, as well as whether Intel and others will support the standard. If FreeSync does gain traction, it will also be interesting to see if NVIDIA begins supporting FreeSync, or if they will remain committed to G-SYNC. Anyway, we should start to see shipping hardware in the near future, and we’ll get answers to many of the remaining questions over the coming year.
    Gallery: AMD Shows Off Multiple FreeSync Displays



    More...

  5. #4615

    Anandtech: AMD Demonstrates Working Carrizo Laptop Prototype

    In our visit with AMD we got to see something I wasn’t really expecting: a functioning Carrizo laptop. (Note that AMD wouldn't let us take pictures, but they did provide some pictures for us to use.) AMD apparently only received initial silicon back from the fab a few weeks back, and they already have a laptop up and running with the early hardware. In fact, not only did they have a functioning Carrizo laptop but they also had several other working Carrizo systems running Windows. Of course, last year AMD had Kaveri up and running and that launched about five months later, so we’re a bit earlier than that for Carrizo but it’s coming along nicely.
One of the features of Carrizo is full support for H.265 decoding, and as an example of why this is needed they had an Intel system running next to the Carrizo system, both attempting to play back a 4K H.265 video. While the AMD system was easily able to handle the task without dropping any frames, the Intel system was decoding at what appeared to be single-digit frame rates. The 4K content was essentially unwatchable on Intel. Of course that's easy enough to remedy by adding an appropriate GPU that can handle the decoding, but AMD's point is that their APU on its own is able to do something that a high-end Intel CPU cannot do without additional hardware.
    As far as other aspects, we do not have any details on the system specifications or expected final clocks. I did see the clock speed of the prototype laptop, but it’s certainly not final so there’s not much point in going into more detail. AMD also indicated that their eventual goal is to have the prototype laptop equipped with a discrete GPU for Dual Graphics support, but that isn’t in the current prototype.
    In terms of using the system, we were unable to run any benchmarks or really do anything more than open Windows Explorer and the system properties. Given this is early hardware there are sure to be some kinks to get worked out over the coming months. AMD is still on track for a Q2/Q3 release of Carrizo, and we’re looking forward to seeing what the Excavator core can do in terms of performance. Also note that the GPU will be “Next Generation” GCN (from the redundant department of redundancy?), with support for DX12. It should be an interesting fall when Carrizo ships.
    Gallery: AMD Demonstrates Working Carrizo Laptop Prototype



    More...

  6. #4616

    Anandtech: Hands On with Gigabyte Notebooks at CES

    One of the notebook vendors that I’ve been really interested in visiting is Gigabyte, as their P-series offerings look quite promising. Aesthetics are of course a rather subjective topic, but I do like the fact that Gigabyte tends to be a bit less gaudy with their own brand of gaming notebooks. Of course they also make the AORUS brand of gaming notebooks that is far less subdued, but we’ll cover that in a separate piece.
Starting with their highest-end offering, the Gigabyte P37X is a 17.3" notebook that comes with a GTX 980M GPU, and it's the thinnest 17.3" laptop to be equipped with that GPU. It supports a pair of mSATA SSDs and can hold up to two 2.5" drives (if you are willing to give up the optical drive). Cooling consists of dual fans, and the keyboard also includes a macro hub. Perhaps most importantly, the display is a 1080p IPS (or AHVA?) panel, providing wide viewing angles and generally better colors than TN displays. The P37 will be available in three different SKUs: the P37X targets a price of $2100-$2300 (with GTX 980M), the P37W will cost $1750-$2000 (GTX 970M), and the P37K will cost $1150-$1350 (GTX 965M). Availability is expected in February.
    Up next and very similar in appearance to the P37 is the P35, now in its third iteration. It has basically all of the same features as the P37, only with a 15.6” screen and slightly smaller chassis. What’s truly surprising is that Gigabyte has managed to put a full GTX 980M 8GB into a laptop that’s only 20.9mm thick – along with two mSATA SSDs, a 2.5” drive, and either a slim optical drive or a second 2.5” drive bay. Unlike the 17.3” market, HiDPI displays are readily available for 15.6” laptops, and the P35 comes with a WQHD+ (3200x1800) IPS display. There are three SKUs again, the P35X v3 is at the top and comes with GTX 980M with a price of $2100-$2400, depending on configuration; P35W v3 comes with GTX 970M and will cost $1750-$2000. Both of these SKUs already began shipping in late 2014. Finally, a third SKU, P35K v3, was launched at CES with the newly released GTX 965M; it will cost $1500-$1600 and should be available this month.
    Last but not least, the P34 is also in its third iteration, and it should begin shipping this month or next. It has a 14” 1080p IPS display and features a single mSATA SSD and a 2.5” bay. It’s as thin as the P35, 20.9mm, and weighs an impressively light 1.71kg. The P34W v3 comes with a GTX 970M with a price of $1700-$1900 and should ship this month. The P34K v3 is the latest addition with a GTX 965M and a price of $1500-$1600, and it should ship next month.
    In terms of our hands on time with the devices, all of them feel relatively solid and it’s great to see Gigabyte opting to avoid the use of any TN displays. Build quality seems good if not exceptional – the chassis is mostly plastic, but still relatively solid to handle. The keyboard and touchpad also seem good, though the 10-key layout on the P35 in particular basically breaks all the 10-key rules and will not be especially useful. The only real question I have is whether the laptops can truly cope with the level of hardware they’re using, as substantially larger laptops sometimes get a bit hot.
    We’ve asked for the chance to review all of the latest P3x offerings, though not all at once, and we hope to be able to test the P35X v3 in particular. GTX 980M with a 3K display in a very thin 15.6” chassis happens to press all the right buttons for me. Let’s hope it can live up to our high expectations.
    Gallery: Hands On with Gigabyte Notebooks at CES



    More...

  7. #4617

    Anandtech: AORUS Notebooks Updated with New X5 and X3 Plus

    If you think the rather subdued (some might even go so far as to say drab or boring) Gigabyte gaming notebooks aren’t for you, the company has their AORUS brand to perhaps win you over. These feature much more aggressive styling and definitely go for the gamer vibe, with a black and angular “stealth” aesthetic. AORUS was launched last year as a high performance gaming notebook brand, and over the year we’ve seen a few updates. Initially consisting of the X7 model with SLI, AORUS has now been expanded with both 13.9” X3 and 15.6” X5 models. The latter is the latest and greatest, with some impressive specs as well.
    The AORUS X5 starts out with NVIDIA’s new GTX 965M, which was rather quietly released at this CES on Tuesday. It’s basically a trimmed down version of the GM204 chip, coming in slightly below the GTX 970M in terms of performance but with a lower price as well. Except, the AORUS X5 has not one but two GTX 965M GPUs in SLI, pushing performance above the level of a single GTX 980M in some benchmarks. I’ll admit that I’m not really a huge fan of SLI laptops, as I’d rather have the best single GPU solution available before shifting to SLI, and the price isn’t really any lower than laptops with a single GTX 980M (e.g. Gigabyte’s own P35X v3). Still, for those times where SLI works as it should, there’s a bit of extra performance available and spreading the heat between two GPUs may have some minor benefits.
Perhaps more important than the SLI 965M however is the use of a 4K display, and an IGZO panel at that. IGZO (indium gallium zinc oxide) is a superior alternative to the amorphous silicon backplanes used in conventional TFT LCDs. The basic summary is that IGZO generally allows for brighter and better images while using less power, but the cost is higher as well. Needless to say, the display looks quite stunning, and while even SLI 965M might struggle to run a lot of games at native 4K with maximum details, for video and multimedia use in particular 4K can be great. Note that there is a non-4K display option as well, for those that don't need (or want to pony up for) 4K.
    The AORUS X5 weighs 2.5kg and is still only 22.9mm thick, which is impressive considering the hardware packed inside. The X5 supports up to three M.2 SSDs in RAID 0 with a 2.5” bay as well. It also has four SO-DIMM slots, supporting up to 32GB of RAM. MSRP ranges from $2299-$2799 depending on configuration. Availability however isn’t expected until Q2, 2015, so if you want the X5 you’re going to have to wait a few more months.
Stepping down a bit in size we have the other new AORUS, the X3 Plus. According to Gigabyte, this is the world's lightest gaming laptop with a GTX 970M, tipping the scales at just 1.87kg (4.11 lbs). That's not actually a true statement, then, as Gigabyte's own P34W weighs a bit less (1.71kg), but we'll let it slide. As with the X5, the X3 Plus features an IGZO display, this time a 13.9" IPS QHD+ (3200x1800) panel. It looks great in person, and with the slightly smaller chassis it's in many ways the most compelling of the AORUS offerings. It also includes up to three mSATA SSDs in RAID 0 (and no HDD). Pricing is set at $2199 with availability set for this month.
Finally, the AORUS X7 Pro launched in late 2014, sporting SLI GTX 970M graphics. Other than the larger chassis and screen, most of the specs are similar to the X5 – three mSATA drives in RAID 0, a single 2TB HDD, and four SO-DIMM slots for up to 32GB RAM. Again I applaud the fact that Gigabyte has managed to source an IPS (or PLS or AHVA?) panel for the X7 Pro, as finding IPS 17.3" displays can be rather difficult. Gigabyte is doing the right thing with their high-end laptops by simply avoiding any budget TN panels, and I wish more gaming notebooks would follow their example. Performance of the SLI 970M should be faster than any other gaming notebook other than those with SLI 980M (Gigabyte claims 30% better performance than a single 980M), and with a weight of 3.0kg (6.6 lbs) this is lighter than any other 17.3" gaming notebook that comes to mind. The X7 Pro is already available with an MSRP of $2599.
    In addition to the AORUS laptops, Gigabyte is also making additional gaming products under the AORUS brand, including a keyboard, mouse, and backpack. Note that unlike the Gigabyte notebooks, AORUS laptops do not support NVIDIA’s Optimus Technology for switchable graphics – though that’s not an option for the SLI models. I believe that manual switching is available, and for gaming notebooks it’s probably not a huge deal. All of the AORUS laptops also feature a block of macro keys on the left side of the keyboard. We’ll hopefully be able to review some of the AORUS models in the coming months as they become available.
    Gallery: AORUS Notebooks Updated with New X5 and X3 Plus




    More...

  8. #4618

    Anandtech: NVIDIA Launches GeForce GTX 965M

    In a rather unusual turn of events, the first I officially knew of the GeForce GTX 965M was when I saw it in some of MSI’s new gaming notebooks on Sunday. NVIDIA usually briefs us on new product launches, but for whatever reason they didn’t feel the need to send us anything in advance. (Maybe it might have something to do with Tegra X1 and all the car stuff being shown at CES?) Officially, NVIDIA unveiled the new GPU two days ago on January 6, but we’ve been busy running around Las Vegas so this is the first chance I’ve had to say a bit more about the new GPU. Let’s get into the details.
Given the quiet nature of the launch, you can already guess that this isn't a new GPU core. Basically we're getting another GPU for laptops based on GM204, one step down from the GTX 970M, placing it somewhere around the level of the GTX 870M but with updated features courtesy of GM204. In terms of specifications, the GTX 965M has 1024 CUDA cores clocked at 944MHz, plus boost clocks, and the GDDR5 memory uses a 128-bit bus and is clocked at 5GHz. By way of comparison, the GTX 970M has 1280 cores clocked at 924MHz (plus boost) with a 192-bit GDDR5 5GHz interface. This effectively puts performance at roughly half the speed of the desktop GTX 980, or around 2/3 the performance of the GTX 980M (depending on memory bandwidth use).
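From the core counts and clocks above we can work out the theoretical numbers. This is our own arithmetic, since NVIDIA does not publish these figures directly, and it ignores boost clocks:

```python
# Theoretical throughput derived from the published core counts and clocks
# (boost clocks ignored). Our own arithmetic, not NVIDIA's figures.

def gflops_fp32(cores, clock_mhz, flops_per_clock=2):  # FMA = 2 FLOPs/clock
    return cores * clock_mhz * flops_per_clock / 1000.0

def mem_bandwidth_gbs(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps  # bytes per transfer * GT/s

print(f"GTX 965M: {gflops_fp32(1024, 944):.0f} GFLOPS, "
      f"{mem_bandwidth_gbs(128, 5):.0f} GB/s")
print(f"GTX 970M: {gflops_fp32(1280, 924):.0f} GFLOPS, "
      f"{mem_bandwidth_gbs(192, 5):.0f} GB/s")
```

The compute deficit versus the 970M is only about 18%, but the 965M has just two thirds of its memory bandwidth, which is why bandwidth-heavy workloads will show the bigger gap.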
    NVIDIA unfortunately doesn’t disclose any information on TDP or pricing with their notebook GPUs, and it’s not clear if the GTX 965M will come with 2GB or 4GB of VRAM (or perhaps both configurations will be available). That said, it’s a safe bet that GTX 965M will use less power than GTX 970M, which in turn uses less power than the GTX 980M (around 100W). We will hopefully see some notebooks with GTX 965M in the near future for testing, and considering the number of notebooks we saw sporting the new GPUs we expect that to occur sooner rather than later.


    More...

  9. #4619

    Anandtech: Crucial Announces MX200, BX100 SSDs & SSD Toolbox

Crucial's MX100 has been one of the most successful SSDs on the market. Its very aggressive pricing, along with decent performance and a great feature set, has made it an excellent buy for mainstream users. Here at CES Crucial just introduced the MX200, the successor to the MX100, and a new budget-oriented model called the BX100.
The MX200 is essentially the branded version of Micron's M600 that we reviewed earlier. The notable change compared to the MX100 is that the MX200 features Dynamic Write Acceleration (DWA), which is Micron's/Crucial's SLC cache implementation. I covered the feature in detail in our M600 review, but in short, the SLC cache size is adaptive and changes depending on how much data the user is storing on the drive (unlike e.g. Samsung's and SanDisk's implementations, where the SLC cache size is fixed). I wasn't very impressed by the performance of the M600 and DWA, but what DWA does provide is higher endurance, since SLC is significantly more durable. Crucial is rating the 250GB version at 80TB, the 500GB at 160TB and the 1TB at 320TB, which is a notable increase over the 72TB rating that the MX100 had.
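The fixed-versus-adaptive distinction can be sketched with a toy model. To be clear, the formulas and numbers below are invented for illustration; this is not Micron's actual DWA algorithm:

```python
# Toy model contrasting a fixed SLC cache with an adaptive one like DWA.
# The numbers and formulas here are invented for illustration; this is
# not Micron's actual Dynamic Write Acceleration algorithm.

def fixed_slc_cache_gb(capacity_gb, fraction=0.03):
    """Fixed cache: a constant slice of the drive, regardless of fill level."""
    return capacity_gb * fraction

def adaptive_slc_cache_gb(capacity_gb, used_gb, bits_per_cell=2):
    """Adaptive cache: spare NAND runs in 1-bit SLC mode, so the cache
    shrinks as user data fills the drive (MLC stores 2 bits per cell)."""
    free_gb = max(capacity_gb - used_gb, 0)
    return free_gb / bits_per_cell

for used in (0, 100, 200, 250):
    print(f"{used:>3} GB used: adaptive ~{adaptive_slc_cache_gb(250, used):6.1f} GB, "
          f"fixed {fixed_slc_cache_gb(250):.1f} GB")
```

The trade-off is visible even in the toy version: a near-empty drive gets a huge adaptive cache, but a nearly full one gets less than a fixed cache would provide.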
Otherwise the MX200 is very similar to the MX100. It's a Marvell 88SS9189 based design with Micron's 16nm 128Gbit NAND, and as usual the MX200 features DevSleep, TCG Opal 2.0 and eDrive encryption. MSRPs are $140 for 250GB, $250 for 500GB and $470 for 1TB, which is certainly a bit more than the MX100. M.2 and mSATA models are also available, though the capacities only go up to 500GB. Availability will be later this quarter, and we expect to get samples in the next couple of weeks.
The other SSD that Crucial is launching is more interesting. The BX100 will be Crucial's entry-level drive (the B stands for budget), and the intriguing part is that Crucial is using Silicon Motion's 2246EN controller with 16nm 128Gbit NAND, which is a change from Crucial's usual Marvell designs. In fact, the BX100 is the first drive from a NAND OEM to ship with a Silicon Motion controller, so that is certainly a big design win for the company. I've been pretty pleased with the 2246EN and it has done well in our tests, so I can see why Crucial chose to go with that one.
Feature-wise the BX100 drops all the M-class features, so there is no hardware-accelerated encryption or SLC caching. Pricing is $70 for 120GB, $110 for 250GB, $200 for 500GB and $400 for 1TB, so it's very competitively priced like the MX100, although given the lack of features I would have liked to see slightly lower pricing, since the MX100 currently retails for about the same prices. Availability is also Q1'15 and we will be getting samples soon.
    Finally, after a long period of waiting, Crucial is launching its own toolbox for SSDs, called the Crucial Storage Executive. The 1.0 version is a fairly basic toolbox with support for firmware updates, drive monitoring, secure erase and PSID revert, although Crucial has plans to add more features in the future. Supported drives are currently the M500, M550, MX100, MX200 and BX100 and the software is already available for download from Crucial's website.
    Crucial MX200 Product Page
    Crucial BX100 Product Page
    Crucial Storage Executive


    More...

  10. #4620

    Anandtech: Microsoft Demonstrates Updated Lumia Camera App At CES

Last night I got a chance to meet with Juha Alakarhu, head of Lumia imaging technologies, to get a demonstration of the updated Lumia Camera app that will be coming soon. The new app will be enabled by the Denim firmware update, but due to the ISP requirements it will only be available on the Nokia devices that shipped with the Snapdragon 800 series of SoCs, which at the moment means the Lumia 1520 and the Lumia Icon/930. Some of the changes are pretty dramatic, so hopefully some parts of the new app will come to the Snapdragon S4 and 400 devices as well.
The first major change is the camera startup time. When Windows Phone was first launched, one of the features touted was "pocket to picture" with the dedicated camera button. As time has gone on, the camera button is no longer a requirement, so this has fallen off a bit, to the detriment of the platform. But an even larger issue with the Nokia Camera app was how long it would take to be ready to take a picture. For example, on the Lumia 735, launching the Lumia Camera app takes about four seconds to load and be ready to take a photograph. Juha discussed how Microsoft has worked to create a camera app that can load almost instantly. On the updated Lumia 930 with Denim, launching the camera app now takes around a second. I don't have my equipment in Vegas to do a time lapse and get the exact amount of time, but pressing the camera button on the phone has the camera app ready to go almost instantly. I cannot imagine that these changes will not be made available to all of the Lumia phones, as more efficient code will be even more pronounced on the slower processors of those devices.
    Microsoft also reworked the algorithms on the image quality side, as they tend to do. Although I do not have an older 930 to test with anymore, they did have some side by side comparisons with the old algorithms and the new ones, and the image quality is even better than before.
Another area where Microsoft and Nokia have both been missing a major feature compared to their rivals is High Dynamic Range (HDR). HDR has been an option on Android and iOS for a long time, and with the relatively limited dynamic range of smartphone camera sensors, many people really like the extra pop that HDR can bring to an image. The HDR on the Lumia is not just standard HDR though. Microsoft is calling their implementation "Rich Capture", and it covers a couple of features in addition to HDR. But the basic implementation is that, like any other HDR setting, it takes several bracketed photos at different exposures and then merges them together to create a single image. The Lumia Camera implementation will allow you to customize the exposure after the fact by keeping all of the photos together, and you can adjust how much HDR you want to add to the image when you are done. It is a slick implementation, and the added customization is always welcome.
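The merge step can be sketched as a per-pixel weighted blend of the bracketed exposures that favors well-exposed pixels. This is a grossly simplified toy version of exposure fusion, not Microsoft's actual Rich Capture pipeline:

```python
# Toy exposure fusion: per-pixel weighted blend of bracketed exposures,
# favoring pixels near mid-gray (i.e. well exposed). Grossly simplified;
# this is not Microsoft's actual Rich Capture pipeline.

def fuse(exposures):
    """exposures: list of images, each a list of pixel values in 0..255."""
    fused = []
    for pixels in zip(*exposures):
        # weight falls off as a pixel drifts toward black or white
        weights = [1.0 / (1.0 + abs(p - 128) / 128.0) for p in pixels]
        blended = sum(w * p for w, p in zip(weights, pixels)) / sum(weights)
        fused.append(blended)
    return fused

under  = [10, 60, 200]   # underexposed bracket
normal = [40, 140, 255]  # normal exposure
over   = [120, 250, 255] # overexposed bracket
print([round(v) for v in fuse([under, normal, over])])
```

Keeping the brackets around, as Lumia Camera does, simply means this blend can be re-run later with different weights when the user drags the HDR slider.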
    Gallery: Lumia Rich Capture


Likely the most impressive addition to Rich Capture though is a new feature that I believe is unique to Lumia, called Dynamic Flash. With smartphones, the flash is almost necessary for any sort of low-light imaging due to the small sensor size. With OIS and larger sensors and pixels on some of the phones, you can sometimes get away without using the flash except in the most extreme circumstances. Apple's solution was to implement a dual flash, which adjusts based on the scene; this requires a specific flash in the device though. The Lumia solution is that the camera takes two photos – one with the flash and one without. It then blends them together to keep the shot from being blown out by the harsh LED flash. The Dynamic name comes in because you can adjust the amount of flash in the image after the fact. Juha discussed how he likes to keep the flash on even for daytime shots now, just so that he can add a bit of sparkle to the eyes in his images. This is a fantastic addition and is really amazing to see for the first time.
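Conceptually, the after-the-fact adjustment amounts to mixing the two exposures by a user-chosen amount. Here is a toy version of that idea – illustrative only, not Microsoft's Dynamic Flash implementation:

```python
# Toy version of an adjustable flash blend: mix a no-flash and a flash
# exposure of the same scene by a user-chosen amount after the fact.
# Illustrative only; not Microsoft's Dynamic Flash implementation.

def blend_flash(no_flash, flash, amount):
    """amount = 0.0 keeps the ambient shot; 1.0 keeps the full-flash shot."""
    assert 0.0 <= amount <= 1.0
    return [(1 - amount) * a + amount * f for a, f in zip(no_flash, flash)]

ambient = [20, 35, 50]    # dim ambient exposure
flashed = [90, 140, 180]  # bright flash exposure, risks looking harsh
print(blend_flash(ambient, flashed, 0.3))  # just a touch of flash
```

A real implementation would align the two frames and blend adaptively per region, but the key point is the same: both source photos are retained, so the flash amount stays adjustable.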
    Gallery: Dynamic Flash


    Another area where the Lumia camera fell short was in shot to shot time. Once again, I do not have all of my test equipment here so I will see how much of a difference they have made, but now they have changed the app to allow you to take a new photo while the last image is still being processed. Shot to shot is still not up there with an iPhone, but it does seem better overall. When I get back home I will test the new camera against the numbers from our 930 review and see what the increase is.
    However, there is another solution to shot to shot time that has been added as well. The new Lumia Camera app now supports 4K video at 30 frames per second. 4K is about an eight megapixel image, so the app is now set up so that if you hold the shutter button, it will start recording at 4K. When you release the button, the recording stops. You can then open the video, and you have the option to choose the best frame from the video to use as a standalone image. The demonstration we were given was a skateboarder jumping over a person on a bench, and you could freeze the moment when the jumper was right at the peak of his jump. It is useful, although you are limited to the eight megapixel image even though it is a twenty megapixel sensor. Hopefully a future update supports full burst mode.
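The eight-megapixel figure checks out against the sensor; here is the quick arithmetic (assuming a standard 3840x2160 UHD frame, which Microsoft did not explicitly confirm):

```python
# Sanity check on the megapixel math: a 4K UHD frame vs. the 20 MP sensor.
# Assumes 4K here means 3840x2160 UHD, which was not explicitly stated.
uhd_w, uhd_h = 3840, 2160
print(uhd_w * uhd_h / 1e6)     # ~8.3 MP per frame grabbed from 4K video
print(20e6 / (uhd_w * uhd_h))  # the sensor has ~2.4x the pixels of a grab
```

So a frame grab from 4K video recovers well under half the sensor's native resolution, which is why a true burst mode would still be welcome.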
    And although I have already given this away, the final update to Lumia Camera is the 4K video support. Initially, Microsoft had stated that this would be limited to 24 FPS but that appears to have changed and now it is offered in 24, 25, or 30 FPS. You can also choose 720p, 1080p, or 1440p in any of those framerates. In addition, the Lumia Camera app still supports 5.1 Dolby Digital surround sound recording for video with the 4 HAAC microphones.
    This is a fantastic update to the Lumia Camera, and one that was much needed. Some of the features will not be possible on the lower tier phones due to the ISP included, but hopefully the new processing algorithms and especially the startup time can be added. And just because people want to know, no, there was no discussion yet of a Lumia 1020 update.


    More...
