
Thread: Anandtech News

  1. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7661

Anandtech: Smooth As 240 Hz Butter: LG's 27GK750F-B, a 27-inch eSports Monitor with FreeSync

    LG has greatly expanded its lineup of monitors for gamers over the past few years. The company targets mainstream and premium segments of the market, but this month it decided to offer something for “eSports” gamers who demand maximum refresh rates and the lowest possible response time to hit their enemies first. In addition to the rather extreme peak refresh rate, the monitor also supports LG’s Motion Blur Reduction technology to make fast-paced actions look sharper.
    The LG 27GK750F-B is outfitted with a 27” TN panel with an FHD (1920×1080) resolution that has a refresh rate of up to 240 Hz as well as a 3H antiglare coating. The panel can display 16.7 million (8-bit) colors, has a 400 cd/m2 typical brightness (a bit higher compared to average monitors), and features 170°/160° viewing angles (which is typical of TN technology). LG claims that the native GtG response time of the panel is 2 ms, but it “shrinks” to 1 ms once backlight strobing is activated. LG calls this feature Motion Blur Reduction (MBR) and it looks like it works the same way as NVIDIA’s ULMB — by inserting a black frame after each normal one. LG’s MBR does not work with AMD’s FreeSync and naturally lowers luminance, but when you need to make things look sharper to shoot your enemies more accurately, this looks like a fair trade-off.
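    For some perspective on those numbers: a 240 Hz panel has only about 4.2 ms to present each frame, which is why the 2 ms native GtG response matters, and why strobing the backlight trades brightness for motion clarity. A rough Python sketch of the arithmetic (the strobe duty cycle is an illustrative assumption, not an LG specification):
```python
# Frame-time arithmetic for the refresh rates discussed above.
def frame_time_ms(refresh_hz: float) -> float:
    """Time available to present one frame, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 144, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.2f} ms per frame")
# At 240 Hz there are ~4.17 ms per frame, so the panel's 2 ms GtG
# transition completes comfortably within a single refresh.

# MBR/ULMB-style strobing keeps the backlight dark for part of each frame,
# so average luminance scales with the duty cycle (assumed value below).
duty_cycle = 0.25                 # assumed fraction of each frame that is lit
typical_brightness = 400          # cd/m², from the spec table below
print(f"Strobed brightness: ~{typical_brightness * duty_cycle:.0f} cd/m²")
```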
    Since the monitor was designed for gamers, it is natural that it comes with various enhancements just for this audience. LG highlights features such as the Black Stabilizer, which makes dark areas lighter, a Crosshair that always displays a target point in the center of the screen, and various modes tailored for the FPS and RTS genres.
    Moving on to connectivity and ergonomics: the LG 27GK750F-B has one DisplayPort 1.2a input as well as two HDMI 2.0 inputs (one for a high-end PC, two for game consoles). There is also a dual-port USB 3.0 hub and a 3.5-mm audio jack for headphones. To make the device more comfortable to use, LG equips it with a stand that adjusts for height, tilt, and swivel and even lets the user rotate the monitor into portrait mode. The display lacks VESA mounting, although this might not be a problem for eSports players. Furthermore, since many of the latter do not really need additional bling, bells and whistles, LG decided not to equip the 27GK750F-B with customizable RGB LEDs and kept decorations to a minimum: a couple of red inlays here and there.
    LG's 27" Class Full HD Gaming Monitor with FreeSync
    27GK750F-B
    Panel 27" TN
    Native Resolution 1920 × 1080
    Refresh Rate 240 Hz
    Dynamic Refresh Rate AMD FreeSync*
    *Not available when LG MBR enabled
    Response Time 2 ms (gray-to-gray)
    1 ms with Motion Blur Reduction
    Brightness 400 cd/m²
    Contrast 'Mega' (dynamic)
    Viewing Angles 170°/160° horizontal/vertical
    Color Gamut NTSC: 72%
    Inputs 2 × HDMI 2.0
    1 × DisplayPort 1.2a
    USB Hub Dual-port USB 3.0 hub with Quick Charging
    Proprietary Enhancements Motion Blur Reduction
    Black Stabilizer
    Dynamic Action Sync
    Crosshair
    Flicker Safe
    Built-in presets for FPS, RTS games, browsing/reading
    Power Consumption Idle ~0.5 W
    Active 30 W
    Stand Adjustments Tilt -5º to 15º
    Swivel -20º to +20º
    Height 110 mm
    Pivot 0º to 90º
    Detailed Information Link
    Check Availability Amazon
    Gallery: LG Targets ‘eSports’ Gamers with 27GK750F-B: 27-inch, FHD, 240 Hz & FreeSync


    The LG 27GK750F-B display is listed on the manufacturer’s website and is priced at $549.99. The monitor is currently unavailable from major retailers like Amazon or Newegg, but expect it to hit the shelves in the coming weeks.
    Related Reading





    More...

  2. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7662

    Anandtech: Micron Finishes GDDR6 Internal Qualification, Eyes H1'2018 Mass Production

    With rumors swirling pretty widely that the next generation of video cards from both vendors will be based around GDDR6 memory, there’s understandably a lot of interest in the development and release status for the new memory. While GDDR6 is not set to ship until 2018, 2017 is now just weeks from coming to a close, meaning “later” is quickly becoming “sooner” and vendors are releasing increasing amounts of information about their GDDR6 plans.
    Micron is no exception to that rule; the company published a blog post this morning recapping its 2017 graphics memory efforts and outlining its production initiatives for 2018. The company already has a foot in the door of the next-generation memory game with GDDR5X, and as previously announced, it is looking to pivot that experience into producing GDDR6.
    Getting to the meat of Micron's blog post then, the company has revealed today that they have finished the design and internal qualification of their 12Gbps and 14Gbps GDDR6 chips. This means that, at least internally, they have completed what is arguably the hardest part of GDDR6 development from the memory side of matters, and they can now focus on mass production and the development of even higher speed bins. Interestingly, according to the company they actually came in a bit ahead of schedule here, as they weren't originally planning to be done with internal qualification quite this soon.
    As far as mass production goes, Micron has previously discussed reaching mass production of GDDR6 in the first half of 2018, and as of their latest update that's still the goal. In fact, the company is already sampling the chips. Notably, they've said that they're "pushing" for H1 on mass production, but with internal qualification slightly ahead of schedule, they are in a better position to hit their goals.
    Meanwhile, the company's chip designers are now buckling down to work on the next set of GDDR6 speed bins, including 16Gbps GDDR6. This is one area in particular where the company is looking to leverage its GDDR5X experience, as the biggest challenge at this point is the signaling rather than the memory cells themselves. Successive GDDR standards have generally made only modest changes to the internal clockspeed of the memory cells, while the interface speeds have continued to ratchet up. And while GDDR5X and GDDR6 have some notable differences between them, Micron's existing experience with 12Gbps+ GDDR5X is generally applicable to GDDR6 as well. However, at least for the moment, Micron isn't saying when a 16Gbps speed bin may be available, since it's so early in development.
    Finally, while GDDR6 will be the new cutting edge graphics memory for 2018, the success of GDDR5 over the past decade means that memory isn’t going anywhere any time soon, and Micron’s blog post notes that the company is continuing to invest in that as well. The company has started mass production and shipping of 8Gb GDDR5 chips built on their new “1Xnm” DRAM manufacturing process, which is part of their long-term plans for continuing GDDR5 production. Even in a best-case scenario, GDDR6 is likely to hold a price premium for quite some time, so GDDR5 is going to be the more reasonably priced memory option for mainstream video cards and other devices that need the higher bandwidth of GDDR memory, but not the speeds (or costs) of GDDR6. The company has also noted that the new DRAM offers “increased speed margins” but hasn’t elaborated on just what that means.
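    To illustrate why those per-pin speed bins matter, the aggregate bandwidth of a GDDR-style interface is simply the per-pin data rate multiplied by the bus width. A quick sketch (the 256-bit bus is an assumed example configuration, not a Micron or GPU-vendor spec):
```python
# Aggregate memory bandwidth = per-pin data rate (Gbps) x bus width (bits) / 8.
def bandwidth_gbytes(per_pin_gbps: float, bus_width_bits: int) -> float:
    """Aggregate bandwidth in GB/s."""
    return per_pin_gbps * bus_width_bits / 8

bus_width = 256  # bits; an assumed example configuration
for rate in (12, 14, 16):  # GDDR6 speed bins mentioned above
    print(f"{rate} Gbps x {bus_width}-bit -> {bandwidth_gbytes(rate, bus_width):.0f} GB/s")
# 12 Gbps -> 384 GB/s, 14 Gbps -> 448 GB/s, 16 Gbps -> 512 GB/s
```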


    More...

  3. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7663

    Anandtech: ZOTAC Unveils AMP Box and AMP Box Mini eGFX TB3 Chassis

    ZOTAC has been working on its external chassis for graphics cards with a Thunderbolt 3 interconnect for well over a year now. Apparently, according to ZOTAC, the time it spent was worth it. On Thursday, the company announced not one, but two eGFX TB3 enclosures targeting different audiences and offering different features. The AMP Box is designed for those demanding maximum performance, whereas the AMP Box Mini is aimed at people seeking a quiet and portable solution.
    About 1.5 years after the first external chassis for graphics cards with TB3 emerged, the market for such solutions seems to be doing quite well, and so it has begun to segment into niches. Initially, companies like ASUS, AKiTiO, PowerColor, Razer, and others only offered eGFX boxes for large desktop graphics cards offering maximum performance and consuming a huge amount of power. Then we saw GIGABYTE and GALAX/KFA2 launch rather compact eGFX chassis with pre-installed graphics cards, targeting casual users who do not want to assemble anything themselves and do not want an add-on component that is larger than an SFF PC. Now ZOTAC is releasing two separate solutions (another signal that the market is well established): the AMP Box for DIY enthusiasts that is compatible with high-end graphics cards, and the AMP Box Mini for owners of ZOTAC's tiny ZBOX computers with a TB3 port as well as everyone who wants a compact eGFX enclosure.
    ZOTAC's larger AMP Box uses an aluminum chassis that can accommodate a dual-slot graphics adapter up to 228.6 mm (9") long, and it provides two 8-pin PCIe auxiliary power connectors. It should be noted that most high-end reference graphics cards from AMD and NVIDIA are around 26.7 cm (10.5") long and are not going to fit into the AMP Box. Therefore, those who plan to use it will have to get "mini" versions of the GPUs from ZOTAC or other manufacturers. To ZOTAC's credit, the AMP Box is a bit smaller than its competitors from ASUS, AKiTiO, or PowerColor. The eGFX TB3 enclosure is equipped with a quad-port USB 3.0 hub (two ports on the front, two on the back, one supporting Quick Charging) as well as a 450 W PSU (it is unclear whether we are dealing with a standard or a custom unit here) to guarantee compatibility with ultra-high-end video cards that need more than the "standard" 250 W. Given the wattage of the power supply, it is likely that the AMP Box can also charge a laptop while in use, but the maker has yet to confirm it.
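    As a rough sanity check on that 450 W figure, a quick power-budget sketch shows how much headroom could remain for laptop charging; all of the non-GPU allowances below are illustrative assumptions, not ZOTAC figures:
```python
# Rough power budget for the AMP Box's 450 W supply.
# Every allowance below is an illustrative assumption, not a ZOTAC figure.
psu_watts = 450
gpu_watts = 300          # an overclocked card above the "standard" 250 W
usb_hub_watts = 4 * 4.5  # four USB 3.0 ports at ~4.5 W each
overhead_watts = 20      # fans, TB3 controller, conversion losses (guess)

headroom = psu_watts - gpu_watts - usb_hub_watts - overhead_watts
print(f"Remaining headroom: ~{headroom:.0f} W")
# Roughly 110 W of headroom remains -- enough, on paper, for USB-C laptop
# charging, though ZOTAC has not confirmed that the AMP Box supports it.
```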
    Since the AMP Box was designed for demanding gamers, it is also equipped with ZOTAC's Spectra programmable RGB lighting to add some shine to its grey aluminum exterior.
    The smaller ZOTAC AMP Box Mini, which comes in a black metal chassis, only fits dual-slot add-in cards that are up to 200 mm (7.87") long and need no more than a single 6-pin PCIe power connector. The manufacturer proposes using this enclosure for entry-level graphics cards or even high-capacity SSDs. Considering that ultra-compact form-factor desktops, as well as many notebooks, come with rather mediocre CPUs that are barely designed to run demanding games, mainstream graphics cards will be optimal for such systems. Moreover, those who need to attach a laptop or a tiny desktop to three or four monitors do not need maximum performance in games, so the AMP Box Mini may end up quite popular with such users.
    The design of the enclosure is similar to the design of ZOTAC’s MI553 SFF PC, but the box itself is compatible with all of the company’s UCFF systems with a TB3 port, including the CI549 nano, MI549 nano, MI552, and MI572.
    ZOTAC's AMP Box and AMP Box Mini eGFX Chassis vs. Razer Core
    (columns: AMP Box ZT-TB3BOX | AMP Box Mini ZT-TBT3M-180-BB | Razer Core V2)
    Chassis Length: 27.1 cm (10.67") | 18.3 cm (7.2") | 34 cm (13.38")
    Chassis Height: 25.7 cm (10.13") | 9.9 cm (3.9") | 21.84 cm (8.6")
    Chassis Width: 14.6 cm (5.75") | 23 cm (9.06") | 10.5 cm (4.13")
    Max Graphics Card Length: 22.8 cm (9") | 20 cm (7.87") | 31.2 cm (12.2")
    Max Graphics Card Height (PCB + cables): 14.5 cm (5.71")
    Max Graphics Card Width: 4.3 cm (1.69")
    Maximum GPU Power: 250 W | 150 W | 375 W
    PSU Wattage: 450 W | 180 W (external) | 500 W
    PSU Form-Factor: ? | custom | internal proprietary
    Cooling Fans (mm): - | - | 3 × 80 (?)
    Thunderbolt: 1 × TB3 | 1 × TB3 | 1 × TB3
    Ethernet: - | - | 1 × GbE
    USB: 4 × USB 3.0 | ? | 4 × USB 3.0
    SATA: - | - | -
    DisplayPort: - | - | -
    Availability: 2018 | 2018 | Q4 2017
    Price: ? | ? | $499
    ZOTAC plans to demonstrate the AMP Box and the AMP Box Mini eGFX enclosures at CES next month. The units are expected to hit store shelves sometime in Q1 or Q2 of 2018, but their exact prices are unknown.
    Buy ZOTAC GeForce GTX 1080 Ti Mini on Amazon.com
    Related Reading




    More...

  4. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7664

    Anandtech: AMD Releases Radeon Software Adrenalin Edition 17.12.2

    Last Tuesday, AMD rolled out their major display driver update for 2017: Radeon Software Adrenalin Edition 17.12.1. Not long after, this week AMD released Radeon Software Adrenalin Edition 17.12.2, focusing solely on bugfixes, though not all fixes pertain to Adrenalin. To recap some of the highlights, Adrenalin brought the new Radeon Overlay, including performance monitoring, an OSD, and in-game Radeon Settings changes. The major update also included a number of enhancements and expanded support for Enhanced Sync, ReLive, and other pre-existing features.
    Buy MSI RX 580 Armor OC 8G on Newegg
    Of the documented resolved issues, a good number are directly related to Adrenalin, alongside general bugs and application-specific issues.
    Adrenalin Related Resolved Issues

    • Radeon Overlay performance metrics may appear and disappear intermittently during updates.
    • Color Temperature controls may change colors on the incorrect display when using the reset option.
    • Radeon Settings Video tab may disappear on some hybrid graphics system configurations after a reboot.
    • Region recording in Radeon ReLive will continue to record when the region window is closed.

    Application-specific Resolved Issues

    • Stuttering during Netflix playback in a browser or via UWP application
    • Star Wars Battlefront II graphical corruption in some areas of the game.
    • Heavy flickering or corruption in Ark Survival Evolved when enabling the performance metrics overlay in Windows 7.

    Other Resolved Issues

    • 3x1 display configurations may experience instability during Eyefinity creation or during gaming.
    • AMD XConnect Technology enabled system configurations may experience an intermittent system hang on hot plug.
    • GPU Display scaling may fail to enable when desktop resolution is set very low.
    • A black screen may be experienced when running full screen games on the Samsung CF791 Radeon FreeSync enabled display.

    Buy PowerColor RX 580 Red Dragon OC 4G on Newegg
    17.12.2 also updates the list of currently open issues:
    Adrenalin Related Known Issues

    • The “Reset” function in Radeon Settings for Display, ReLive, and Video may not work as intended when using Radeon Settings in certain regional languages.
    • Trimming videos may fail to create a thumbnail if the video contains non-English characters.
    • Flickering may be observed on the performance metrics overlay when Enhanced Sync is enabled on some Radeon FreeSync connected displays.
    • Performance Metrics Overlay may hang if enabled when cycling display power off and on.

    Other Known Issues

    • Rise of the Tomb Raider may experience an intermittent application hang during gameplay.
    • Radeon Settings may experience a hang when enabling CrossFire with three or more graphics products.
    • Radeon WattMan may intermittently fail to load profiles for Radeon RX Vega on the global Radeon Wattman page.
    • A random system hang may be experienced after extended periods of use on system configurations using 12 GPUs for compute workloads.
    • The GPU Workload feature may cause a system hang when switching to Compute while CrossFire is enabled; a workaround is to disable CrossFire before switching the toggle to Compute workloads.

    While listed in the release notes for 17.12.1, for 17.12.2 it is not clear if “Upgrading Radeon Software with mGPU RX Vega configurations on X99 chipsets may cause system instability after reboot” remains an open issue.
    The updated drivers for AMD's desktop, mobile, and integrated GPUs are available through the Radeon Settings tab or online at the AMD driver download page. More information on this update and further issues can be found in the Radeon Software Adrenalin Edition 17.12.2 release notes.
    Buy MSI RX 580 Gaming X 8G on Amazon.com



    More...

  5. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7665

    Anandtech: LG Develops ‘Nano IPS’ LCD, Unveils 32UK950 4K Display with DCI-P3, HDR600

    LG has announced a new kind of IPS liquid crystal panel that features improved color reproduction. The Nano IPS technology will be used for LG's upcoming high-end displays due in 2018 and will enable professional-grade DCI-P3 color space coverage on consumer models. One of the first monitors to use Nano IPS will be the 32UK950, LG's new flagship consumer 4K LCD, which will feature the VESA HDR 600 badge along with an integrated Thunderbolt 3 dock.
    Nano IPS and HDR 600

    LG's Nano IPS technology will be used on numerous high-end monitors from the company, so it makes sense to examine what the manufacturer tells us about it before jumping to the actual product. LG says that it applies nanoparticles to the screen's LED backlighting to absorb excess light wavelengths and improve the intensity, purity, and accuracy of on-screen colors. Controlling the spectral output of the backlight is a method commonly used to improve IPS LCD panels; quantum dots and Panasonic's light-modulating cells do just that.
    Adjusting the spectral output of the backlight can improve not only color reproduction but also contrast ratio, and this is where LG's press release gets vague. It never discloses or even mentions a static contrast ratio, yet to earn the HDR 600 badge (which the 32UK950 carries), a display needs a black level of no more than 0.1 nits, which VESA believes is impossible without local dimming. However, neither local dimming nor pixel-by-pixel control of backlight intensity is mentioned in the press release.
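    The arithmetic behind that claim is straightforward: 600 nits of peak brightness against a 0.1 nit black level implies a contrast ratio far beyond what a conventional IPS panel delivers natively. A quick check (the typical IPS figure is an assumption for comparison):
```python
# Contrast implied by the DisplayHDR 600 requirements cited above.
peak_luminance = 600    # cd/m², required peak for the HDR 600 tier
max_black_level = 0.1   # cd/m², maximum black level cited above

required_contrast = peak_luminance / max_black_level
typical_ips_contrast = 1300  # :1, a typical static IPS ratio (assumption)

print(f"Implied contrast ratio: {required_contrast:.0f}:1")   # 6000:1
print(f"Typical IPS panel:      ~{typical_ips_contrast}:1")
# The roughly 5x gap is why VESA expects some form of local dimming
# for this tier, and why LG's silence on the subject stands out.
```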
    Wrapping things up, we know for sure that LG’s Nano IPS enables the company to offer an improved color gamut by controlling the LED backlighting. The contrast ratio is something that is expected to be improved on new monitors as well, but LG does not say whether its Nano IPS is responsible for that.
    The LG 32UK950

    Among the first monitors to feature the Nano IPS technology will be the LG 32UK950. Its 32" panel has a 3840×2160 resolution, can reproduce 1.07 billion colors, and covers 98% of the DCI-P3 color space. The HDR 600 badge clearly points to HDR10 processing capabilities along with up to 600 nits brightness, but LG does not disclose any information regarding its LUTs (look-up tables) for HDR. LG's current-generation consumer flagship display (the 32UD99-W) can cover 95% of the DCI-P3 gamut, a bit lower than the 97% DCI-P3 coverage of the 31MU97-B, a professional display with a 4096×2160 resolution. The upcoming 32UK950 will surpass both models when it comes to gamut coverage.
    Preliminary Specifications of the LG 32UK950
    Panel 32" IPS with Nano IPS technology
    Resolution 3840 × 2160
    Refresh Rate 60 Hz (?)
    Viewing Angles 178°/178° horizontal/vertical
    Color Saturation 98% DCI-P3
    Display Colors 1.07 billion
    3D-LUT supported
    Inputs 1 × TB3
    DisplayPort 1.2 (TBC)
    HDMI 2.0a (TBC)
    Audio Integrated speakers
    Another major selling point of the LG 32UK950 will be integrated Thunderbolt 3 connectivity with daisy-chaining support (making it possible to drive two 4Kp60 displays from a single TB3 port on the host). Apart from TB3, we expect the LG 32UK950 to feature regular DisplayPort and HDMI inputs, a USB 3.0 hub, and other essential features.
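    A back-of-the-envelope bandwidth check shows why two 4Kp60 streams fit through a single Thunderbolt 3 link; the blanking-overhead factor below is an approximation, not a published figure:
```python
# Approximate uncompressed bandwidth for two 4Kp60 streams vs. Thunderbolt 3.
def stream_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking=1.2):
    """Rough link bandwidth in Gbps, including an assumed blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

per_display = stream_gbps(3840, 2160, 60)
print(f"One 4Kp60 stream : ~{per_display:.1f} Gbps")
print(f"Two 4Kp60 streams: ~{2 * per_display:.1f} Gbps")
# Roughly 14 Gbps per display and under 30 Gbps for two, which fits within
# the capacity of a 40 Gbps Thunderbolt 3 link -- hence the daisy chaining.
```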
    LG plans to show the 32UK950 at the CES trade show early next month. The company has not revealed when it intends to start sales of the new product, or its MSRP.
    Related Reading




    More...

  6. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7666

    Anandtech: A Budget Home Theater & PC Setup: 4K, HDR, UHD Blu-ray, and More

    The days of bulky home-theater PCs (HTPCs) with built-in tuners, optical disc drives, and integrated media storage capabilities are long gone. In 2017, advancements in the media / home theater space (including the rise in popularity of OTT streaming and rapid adoption of 4K) stabilized. It is now possible to create a relatively budget-friendly home theater setup without the fear of it becoming outdated within a short timespan. Given the market status, we set out to select the components for a modern home theater environment and evaluate some HTPC options. This article goes into the reasoning behind our choice of components and also provides a detailed account of our experiments with a few compact HTPCs.

    More...

  7. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7667

    Anandtech: LG Announces the 5K UltraWide 34WK95U: A 'Nano IPS' Monitor with HDR600

    LG has announced the 34WK95U, a new "dream monitor" for prosumers, gamers, multimedia enthusiasts, and everyone who needs a large ultra-wide screen with a resolution beyond 4K. The new 34" display is expected to be available sometime in 2018, but LG does not say when.
    The LG UltraWide 34WK95U uses an IPS panel with a 5120×2160 resolution (which the manufacturer calls 21:9 5K ultra-wide) and the recently announced Nano IPS technology. The initial benefit of a 'wider than 4K' resolution (5120-wide vs 3840-wide) is for 4K content creators to see a whole video with UI elements down either side. The Nano IPS technology enables the panel to deliver "a broad range of accurate colors", as LG puts it. Given the fact that another Nano IPS-enhanced display from LG features a professional-grade DCI-P3 color space coverage, it is reasonable to assume that the 34WK95U can also cover a comparable percentage of the DCI-P3 color space, which would be something we have not seen on consumer monitors with a 21:9 aspect ratio so far. However, LG does not outright say that its 5K ultra-wide LCD supports the DCI-P3. The display does carry VESA’s DisplayHDR-600 badge, so it supports HDR10 processing and up to 600 nits brightness (but we know nothing about possible local dimming support needed to hit low black levels).

    Moving on to connectivity: a particularly good thing about the UltraWide 34WK95U is its Thunderbolt 3 input, which makes it possible to connect the display to an appropriate PC using just one cable and even feed up to 60 W of power back to the host. It is reasonable to expect the display to support other inputs like DisplayPort and HDMI as well as feature a USB 3.0 hub, but LG is not disclosing the exact specs of the product right now.
    Over the past few years, LG has done a great job popularizing the ultra-wide 21:9 aspect ratio by introducing multiple 21:9 displays itself and selling the panels to other monitor makers. Without any doubt, ultra-wide displays are very handy for gamers and users who multitask a lot (designers, engineers, traders, photographers, video editors, developers, etc.), but the maximum resolution offered by such monitors up to this point has been 3840×1600. This resolution is enough to play back UltraHD content filmed in an aspect ratio of 2.35:1 or 2.40:1 (movies), but if a user needed to watch/edit 3840×2160/4096×2160 content, or just required more vertical pixels (e.g., for large images or long texts), such displays were not an option. The LG UltraWide 34WK95U solves this problem by going to 5120×2160 pixels, granting people who work on 4K content some additional screen real estate for various UI panels when they work on a project fullscreen. Speaking of content creators, it must be noted that the LG 34WK95U is a consumer monitor. Even if it supports DCI-P3, it is most likely tailored for the consumer version of the DCI-P3 color space, not the digital projection version with its different white point and gamma (this is an educated guess rather than official information, though).
    LG is set to showcase the UltraWide 34WK95U at CES next month, and this is where we are going to learn more about this monitor. The company is tight-lipped about the ETA and MSRP of the display, but since this is a unique product and some of its predecessors still retail for $1250 - $1600 more than a year after launch, expect the price of the 34WK95U to be higher than average for a premium consumer LCD.
    Preliminary Specifications of the UltraWide 34WK95U
    Panel 34" IPS with Nano IPS technology
    Resolution 5120 × 2160
    Maximum Brightness 600 cd/m²
    Refresh Rate 60 Hz (?)
    Viewing Angles 178°/178° horizontal/vertical
    Color Saturation "broad range of accurate colors"
    "eye-popping colors"
    "fantastic color reproduction"
    DCI-P3 not confirmed
    Display Colors 1.07 billion
    HDR DisplayHDR-600
    HDR10
    3D-LUT Supported
    Inputs 1 × TB3
    DisplayPort (tbc)
    HDMI (tbc)
    Audio Integrated speakers (tbc)
    In addition to the high-end UltraWide 34WK95U, LG also announced the UltraWide 34GK950G display for gamers. The 34GK950G monitor boasts a 3440×1440 resolution, NVIDIA's G-Sync dynamic refresh rate support, as well as the Nano IPS treatment for improved colors. The company did not disclose any additional details about the 34GK950G, but since it will be demonstrated at CES next month, more information will be available there.
    Related Reading




    More...

  8. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7668

    Anandtech: NVIDIA to Cease Driver Development for 32-Bit Operating Systems

    NVIDIA has announced plans to cease developing drivers supporting 32-bit operating systems for any GPU architecture in the near future. All NVIDIA GPU drivers published after the release 390 (due in early 2018) will only support x86-64 OSes. The decision marks an important milestone in the transition of the PC industry to 64-bit computing that has been ongoing for over a decade, but may upset some of the users who still run older machines or those that require drivers for embedded systems.
    NVIDIA driver version 390 will be the final release from the company to support 32-bit Windows 7/8/8.1/10, Linux, or FreeBSD. Whatever version comes after will only run on 64-bit OSes. The company will continue to release 32-bit drivers containing security fixes until January 2019, but has no plans to improve performance or add features to such releases.
    The transition of the PC industry from 32-bit to 64-bit has taken a very long time, but it seems to be in the final stages of completion, at least for consumer-based machines. AMD released the first x86-64 processors for PCs in late 2003, whereas Microsoft came up with Windows XP Professional x64 Edition for client computers in mid-2005, kicking off the transition of the PC industry to 64-bit computing. By now, all contemporary x86 processors are 64-bit capable and the vast majority of personal computers in all form-factors come with four or more gigabytes of RAM (that’s the maximum user addressable memory for a 32-bit OS), so the absolute majority of new PCs today run a 64-bit OS. The last remnants of 32-bit machines are often long-standing hold-overs, such as the machine that George RR Martin writes his Game of Thrones novels on (although he is still DOS based).
    The other reason that springs to mind for using a 32-bit OS is embedded systems, such as those running point-of-sale environments, display walls, gambling machines, or monitoring. Although these systems are typically deployed for the super-long term, at some point between now and 19th January 2038 they will have to be replaced due to the Unix Epoch time-rollover bug that will affect all 32-bit systems (bug? feature? oversight! Surely no-one will still have a 32-bit system in 2038, right?).
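    For the curious, the 2038 rollover is simply a signed 32-bit counter of seconds since 1970-01-01 UTC running out of range; a minimal Python illustration:
```python
# The "Year 2038" problem: a signed 32-bit time_t counts seconds since
# 1970-01-01 00:00:00 UTC and tops out at 2**31 - 1.
from datetime import datetime, timedelta, timezone

epoch = datetime(1970, 1, 1, tzinfo=timezone.utc)
max_int32 = 2**31 - 1

print(epoch + timedelta(seconds=max_int32))  # 2038-01-19 03:14:07+00:00
# One second later the counter wraps to -2**31, which a 32-bit system
# interprets as a date back in December 1901.
print(epoch + timedelta(seconds=-2**31))     # 1901-12-13 20:45:52+00:00
```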
    According to the Steam hardware survey, 2.19% of users continue to run 32-bit Windows. Those 32-bit systems are legacy machines and PCs that have to run old programs or hardware natively. Owners of such computers are hardly interested in the latest GPUs (modern graphics cards may come with more memory than a 32-bit OS can address) or driver features, so the end of support will likely go unnoticed by the vast majority of involved parties. Meanwhile, 2% - 2.2% of the billion or so PCs in use worldwide is 20 - 22 million systems (and the actual number will be higher because not all PCs run Steam). Therefore, there will definitely be disgruntled owners of 32-bit PCs running entry-level GPUs released in recent years who are now left without ongoing driver support.
    Related Reading





    More...

  9. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7669

    Anandtech: Cheap 75-144Hz FreeSync: AOC Unveils G90 Gaming Monitors from €179

    AOC has introduced its new family of entry-level displays that offer a refresh rate up to 144 Hz, a 1 ms response time, and AMD’s FreeSync technology. The G90 monitors do not belong to the premium AGON lineup designed for serious gamers, but they still take some pages from the AGON book when it comes to design and firmware.
    The AOC G90 family consists of three models: the 24.5-inch G2590VXQ, the 24.5-inch G2590PX, and the 27-inch G2790PX. All three are based on TN panels featuring an FHD resolution, 250 - 400 nits brightness, a 1000:1 contrast ratio, a 1 ms GtG response time, and 170°/160° horizontal/vertical viewing angles (which is just what you would expect from inexpensive TN LCDs). All three monitors share a similar design with rather thin bezels and "gaming red" inlays, but there are some differences when it comes to how the stand adjusts. All three support AMD's FreeSync dynamic refresh rate technology with low framerate compensation (LFC), with a 30-144 Hz range for the G2590PX and G2790PX and a 30-75 Hz range for the cheapest G2590VXQ. Finally, all three support AOC's low input lag feature as well as the Shadow Control enhancement to make dark scenes brighter.
    The G90 lineup from AOC is aimed at a wide audience of gamers and supports a variety of inputs, including D-Sub, DisplayPort, HDMI (two of these, probably to simplify connection of game consoles), and even MHL for the 24.5" models.
    Specifications of AOC's G90 Series Gaming Displays
    (columns: G2590VXQ | G2590PX | G2790PX)
    Panel: 24.5" TN | 24.5" TN | 27" TN
    Native Resolution: 1920 × 1080 (all)
    Maximum Refresh Rate: 75 Hz | 144 Hz | 144 Hz
    Dynamic Refresh Tech: AMD FreeSync with LFC (all)
    FreeSync Range: 30 - 75 Hz | 30 - 144 Hz | 30 - 144 Hz
    Brightness: 250 cd/m² | 400 cd/m² | 400 cd/m²
    Contrast: 1000:1 (all)
    Viewing Angles: 170°/160° horizontal/vertical (all)
    Response Time: 1 ms GtG (all)
    Pixel Pitch: 0.2825 mm | 0.2825 mm | 0.3113 mm
    Pixel Density: 90 PPI | 90 PPI | 81 PPI
    Color Gamut Support: sRGB (all)
    Inputs (G2590VXQ, G2590PX): 1 × DisplayPort 1.2, 2 × HDMI 1.4, 1 × MHL, 1 × D-Sub
    Inputs (G2790PX): 1 × DisplayPort 1.2, 2 × HDMI 1.4, 1 × D-Sub
    USB Hub: - | 4-port USB hub | 4-port USB hub
    Audio: 2 W × 2 speakers, headphone output (all)
    Proprietary Enhancements: AOC Low Input Lag, Shadow Control (all)
    Power Consumption (Idle): 0.5 W (all)
    Power Consumption (Operating): 25 W | 28 W | 32 W
    Stand Adjustments:
    Tilt: -3.5° to 19.5° (all)
    Swivel: - | -20° to 20° | -20° to 20°
    Height: - | 130 mm | 130 mm
    Pivot: 90°
    VESA Mounts: 100 × 100 mm (all)
    Launch Timeframe: January 2018 | February 2018 | December 2017
    Launch Price: €179 / £159 | €279 / £249 | €349 / £309
    Additional Information: Link | Link | Link
    Check Availability: Amazon.com
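    The pixel-density figures in the table above follow directly from the resolution and the panel diagonal; a quick check of the arithmetic:
```python
# Pixel density (PPI) = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f"24.5-inch FHD: {ppi(1920, 1080, 24.5):.1f} PPI")  # ~89.9 PPI
print(f"27-inch FHD  : {ppi(1920, 1080, 27.0):.1f} PPI")  # ~81.6 PPI
```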
    The AOC G2790PX is already available from select retailers in Europe for €349/£309; the G2590PX, its younger brother, will hit the market in February and will cost €279/£249. The most affordable G2590VXQ will ship next month with a €179/£159 price tag.
    Pricing of the new AOC G90-series monitors in the USA is to be announced at a later date. In the meantime, AOC’s partners are selling off previous-gen gaming monitors with discounts right now. For example, the AOC G2460PF that strongly resembles the G2590PX (24” TN, FHD, 350 nits, 1000:1, 144 Hz, 1 ms GtG, FreeSync, DVI, D-Sub, HDMI, adjustable stand, no Shadow Control) can now be purchased for $205 from Amazon.
    Buy AOC G2460PF on Amazon.com
    Related Reading




    More...

  10. RSS Bot FEED's Avatar
    Join Date
    09-07-07
    Posts
    34,807
    Post Thanks / Like
    #7670

    Anandtech: Apple Issues Updated Statement On Battery Ageing

    Following the attention that Apple had gotten over the past few weeks regarding the discovery of mechanisms that reduce CPU frequency on devices with aged batteries, Apple has now issued a more comprehensive statement and apology addressing the matter:
    First and foremost, we have never — and would never — do anything to intentionally shorten the life of any Apple product, or degrade the user experience to drive customer upgrades. Our goal has always been to create products that our customers love, and making iPhones last as long as possible is an important part of that.
    …
    When power is pulled from a battery with a higher level of impedance, the battery’s voltage will drop to a greater degree. Electronic components require a minimum voltage to operate. This includes the device’s internal storage, power circuits, and the battery itself. The power management system determines the capability of the battery to supply this power, and manages the loads in order to maintain operations.
    The statement doesn't offer any new information as to the cause of the issue, and it confirms my initial technical explanation: increased battery impedance prevents the battery from supplying a stable voltage during transient loads.
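    The mechanism Apple describes is essentially Ohm's law: the terminal voltage sags by the load current times the battery's internal impedance, and an aged cell has higher impedance. A minimal sketch with illustrative numbers (none of the values below are Apple's figures):
```python
# Terminal voltage under a transient load: V = V_open_circuit - I * R_internal.
# All numbers below are illustrative assumptions, not Apple's figures.
def terminal_voltage(v_open_circuit: float, load_amps: float, impedance_ohm: float) -> float:
    return v_open_circuit - load_amps * impedance_ohm

v_oc = 3.8           # volts, a partially discharged Li-ion cell
cutoff = 3.0         # volts, assumed minimum the PMIC/storage can tolerate
peak_current = 3.0   # amps, a brief CPU/modem transient

for label, r_internal in (("new cell ", 0.10), ("aged cell", 0.30)):
    v = terminal_voltage(v_oc, peak_current, r_internal)
    status = "OK" if v >= cutoff else "brown-out -> shutdown or throttling"
    print(f"{label}: {v:.2f} V ({status})")
# The aged cell dips below the cutoff on the same transient, which is why
# iOS caps peak frequency rather than letting the device shut down.
```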
    What we do have as new information are the various other effects that the throttling mechanism touches:

    • Longer app launch times
    • Lower frame rates while scrolling
    • Backlight dimming (which can be overridden in Control Center)
    • Lower speaker volume by up to -3dB
    • Gradual frame rate reductions in some apps
    • During the most extreme cases, the camera flash will be disabled as visible in the camera UI
    • Apps refreshing in background may require reloading upon launch
    Some of these points regarding frame rate reduction suggest that iOS is also limiting the GPU frequency, among other unexpected limitations such as lower speaker volume and screen dimming.
    Following the widespread media attention and largely negative feedback that prompted this second official response and statement, Apple promises three key measures to address consumers' concerns:

    • Apple is reducing the price of an out-of-warranty iPhone battery replacement by $50 — from $79 to $29 — for anyone with an iPhone 6 or later whose battery needs to be replaced, starting in late January and available worldwide through December 2018. Details will be provided soon on apple.com.
    • Early in 2018, we will issue an iOS software update with new features that give users more visibility into the health of their iPhone’s battery, so they can see for themselves if its condition is affecting performance.
    • As always, our team is working on ways to make the user experience even better, including improving how we manage performance and avoid unexpected shutdowns as batteries age.
    Reducing the cost of an official battery replacement from $79 to $29 is a very welcome change that makes this a much more attractive option, considering replacement batteries only cost $10-15 depending on the model; Apple's previous $79 pricing was extremely extortionate given how critical this service is. I would now recommend that any users who hesitated to replace their iPhone batteries on their own make use of the official service, as it will have a very noticeable impact on both device battery life and device performance (due to the nature of this story).
    The way that Apple has handled disclosure of the throttling mechanisms has also been heavily criticised, as users felt their devices slowing down with iOS updates without knowing the reason. Here Cupertino promises key changes in the way that iOS shares information on battery health and reporting, as well as promised improvements to performance management under degraded battery conditions. The time-frame given for these updates is "early 2018".
    Overall, Apple's response was the only correct one possible to the whole fiasco, and the only one that could realistically be expected, though it took longer than it should have to implement changes such as drastically reducing the battery replacement cost.


    More...
