Tuesday, September 19, 2017

HTC will introduce three new phones by the end of 2017



At the beginning of 2017, an HTC official outlined the Taiwanese company's smartphone plans for the year. A member of the HTC community and Twitter user Corn Chen has now shared the codenames and some specs of all the devices due to launch by the end of 2017. We are expecting the Ocean Master, Ocean Harmony, and Ocean Lite. According to Chen, the Ocean Master will come with a 6-inch screen and two cameras: a 12 MP sensor, most likely on the back, and an 8 MP one, presumably the selfie snapper. The Master will have two memory options: 4 GB RAM + 64 GB storage and 6 GB RAM + 128 GB storage. The Ocean Lite and Ocean Harmony will each come in a single version with 4 GB RAM and 64 GB storage. The difference between them is that the Lite will have a 5.2-inch screen, while the Harmony will have a 6-inch panel. The Ocean Master and Ocean Lite are reported to be announced in November, while the Ocean Harmony would arrive in the last month of the year, right before Christmas. None of these three phones is the separately rumored U11 Life.


Samsung Galaxy J5 (2016) and Galaxy Tab A with Android 7.1.1 on board spotted in benchmarks



A couple of Samsung Galaxy devices have been spotted in benchmark listings. Specifically, the Galaxy J5 (2016) smartphone and the Galaxy Tab A tablet have appeared on both GFXBench and Geekbench running Android 7.1.1 Nougat. Both the Galaxy J5 (2016) and the Galaxy Tab A running Android Nougat have already received Wi-Fi certification, but the OS version their respective certification listings revealed was 7.0. So for now it's difficult to say exactly which version will be rolled out in the end. What's all but officially confirmed, however, is that Nougat for these devices is coming soon.


Monday, September 18, 2017

T-Mobile will raise its monthly soft-cap to 50GB



Ever since smartphones began demanding more data, carriers have had little choice but to implement soft data caps and deprioritize heavy data users in favor of those who use less. Currently, T-Mobile throttles customers who use more than 32GB of data within one billing cycle, and other carriers' soft caps are generally in the 25GB to 30GB ballpark. According to a leaked internal document, T-Mobile will begin enforcing a new soft data cap of 50GB per billing cycle, which the carrier calls its Fair Usage Threshold. The new soft cap is set to take effect on September 20. T-Mobile likely has no doubt that it can handle the extra capacity. The move is probably meant to get people to switch from legacy plans over to T-Mobile ONE, not to mention entice people to switch to the magenta side in the first place. If you're on a legacy plan, you are likely to save by switching to T-Mobile ONE; after all, T-Mobile has been making new perks and promotions available only to ONE customers with exactly that in mind.


More than 80% of new Apple Watch preorders are cellular models



One of the most notable Apple analysts, Ming-Chi Kuo, had predicted that about 30% to 40% of pre-orders for the new Apple Watch Series 3 would be for the cellular-enabled version. Well, on Monday, the KGI Securities analyst estimated that the LTE-enabled version actually accounts for between 80% and 90% of all Apple Watch Series 3 pre-orders. Pre-orders for the new Apple Watch, the Apple TV 4K, and the iPhone 8 and 8 Plus all opened this past Friday, September 15. While iPhone 8 and 8 Plus supply isn't scarce this year, as folks are likely waiting for the iPhone X to drop, Apple's new always-connected smartwatch is flying off the virtual pre-order shelves. In fact, most Series 3 + Cellular models are backordered 3-4 weeks as of Monday, with the initial batches shipping on September 22. The Apple Watch Series 3 is going to spike growth for the smartwatch industry. We've already seen Samsung do well with its smartwatch platform, and its improvements are noticeable, but Google really needs to step it up if it wants a future with Android Wear.


Samsung to build its own 1,000 fps camera to challenge Sony



Sony was the first to build a mobile image sensor with stacked on-chip memory: the Motion Eye camera on the Xperia XZ Premium and XZs (and later the XZ1 too). Now chatter from Korea suggests that Samsung Semiconductor is looking to build a similar camera in November, ready to use in the next-generation Galaxy S phone. The advantage of on-chip memory is that the camera can store frames fast enough to shoot at 1,000fps or so for slow-motion video. Streaming those frames to the main RAM would be too slow, so Sony built a three-layer chip: pixels, control logic, and memory. Samsung's design is reportedly slightly different. It uses a traditional two-layer chip to which a DRAM chip is bonded, apparently to avoid infringing certain patents. While Samsung's design is not as sophisticated as Sony's, the company has an advantage: it has in-house factories that produce both image sensors and memory chips, whereas Sony has to rely on Micron for the 1-gigabit memory chip. Currently, Samsung uses a mix of Samsung- and Sony-made sensors on its phones; for example, Korean Galaxy S8 phones come with a Samsung image sensor, while the US models have a Sony sensor. The S9 could use all-Samsung sensors if this pans out. (source in Korean)
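To get a feel for why the frames have to be buffered on (or right next to) the sensor, a quick back-of-the-envelope calculation helps. The frame size and bit depth below are assumptions for illustration, not figures from the report:

```python
# Rough estimate of the readout bandwidth a 1,000 fps burst would need.
# Frame size and bit depth are assumed for illustration only.
width, height = 1920, 1080        # Full HD slow-motion frames
bits_per_pixel = 10               # a typical raw sensor readout depth
fps = 1000

gbit_per_s = width * height * bits_per_pixel * fps / 1e9
print(f"Sustained readout: about {gbit_per_s:.1f} Gbit/s")   # ~20.7 Gbit/s

# A 1 gigabit buffer therefore holds only a fraction of a second of footage,
# which is why these super-slow-motion bursts are so short.
print(f"Buffer capacity: about {1e9 / (width * height * bits_per_pixel * fps):.3f} s")
```

Even with these conservative assumptions, the sustained readout is far beyond what a phone's main memory bus is meant to absorb, which is exactly the problem the stacked DRAM solves.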


Saturday, September 16, 2017

Weekly poll results: give us stock Android or we're gonna install Nova Launcher!



We hear it time and time again, and this week's poll confirms it: people prefer stock Android. Overwhelmingly so: this option got a cool 42% of the vote, while second-place Samsung was way behind at 15%. Android gained extensive customization features a while ago; it started with custom launchers, lockscreens and keyboards and ended with native theme support. These days you can tweak stock, non-rooted Android quite a bit... unless a manufacturer's skin gets in the way. We have a tie for third place: Motorola and Xiaomi at 10% (we're even willing to include Sony at 9%, it's close enough). By the way, in the comments you heaped praise on Nova Launcher and indeed, it's a highly capable, very customizable launcher. Unfortunately, there are limits to what even Nova can do (e.g. Xiaomi/MIUI can be overly aggressive in killing background apps).


Understanding HDR: cameras and displays



HDR. You may have come across this term several times, perhaps even in the course of just today. It seems to be everywhere now and everyone is talking about it. But what is HDR? It's in your TV, it's in your phone. It may even be in your camera, but is that even the same thing? And what is color gamut? What do people mean when they say whites are whiter and blacks are blacker, and who is this Dolby Vision person? It's time to find all that out.

To understand HDR, or High Dynamic Range, we must first understand dynamic range. The dynamic range of a system is the difference between the highest and the lowest value it can handle. While the term is used in many fields, the dynamic range we will be discussing today pertains to light. The dynamic range of an optical system is the difference between the highest and the lowest intensity of light it can detect; the wider this range, the more detail the system can capture. A system with a particularly wide range is called a high dynamic range system.

The human eye has a reasonably wide dynamic range. With our eyes, we can look at a scene and see the details in the brightly lit and the dimly lit areas with relative ease. It's only when a scene contains an intensely bright object that our eyes have to adjust by narrowing the iris, at which point they can only really see the bright object, while everything in the darker areas of the scene fades away. If our eyes had a wider dynamic range, we wouldn't have to squint at bright objects and could view them comfortably. Similarly, we wouldn't have to strain our eyes so much in the dark, the way animals such as mice can comfortably see in much less light.

Now let's apply the same logic to a camera system. Just like with the eye, the dynamic range of a camera system is defined by the highest and lowest values of light the system can capture at any given moment. Cameras with wide dynamic range are obviously better, but they also tend to be more expensive. Conversely, cheaper cameras, or those that are physically smaller (smartphone cameras, for example), generally have worse dynamic range.

In cameras, dynamic range is even more important than it is with our eyes. With our eyes, we can only look at and focus on one particular object at a time. Even if we see the entire scene, our eyes are only focused on the object in the center, so if something in the corners is not properly lit it doesn't really matter, because as soon as we shift our gaze there our eyes will adjust. A photograph is different: we can choose to look at different parts of the picture, and because they were captured permanently with a fixed set of parameters, they won't adjust simply because we look at a different point of the image later. For this reason, a wide dynamic range is a much sought-after feature in cameras.

A high-quality camera system should be able to expose correctly for the bright as well as the dark areas of the scene. The image sensor on a good camera can capture enough detail in both the bright and the dark areas of the image. All this light information is often stored in a RAW file, which can later be used to bring out the details in the highlights (the brightest parts of the image) and the shadows (the darkest parts of the image) by turning down the former and boosting the latter. However, there is a limit to this method, and a camera can only capture so much detail in one go across all areas of the image. This is where tone mapping comes in. We have all come across this button in our phone's camera app.
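Under the hood, that button usually triggers a multi-exposure merge of the kind described in the next few paragraphs. As a rough sketch of the idea, and not any particular phone's pipeline, here is what such a merge could look like using OpenCV's exposure-fusion API; the three bracketed shots (under.jpg, normal.jpg, over.jpg) are placeholder files:

```python
import cv2
import numpy as np

# Hypothetical bracketed shots of the same scene: under-, normally and over-exposed.
paths = ["under.jpg", "normal.jpg", "over.jpg"]
images = [cv2.imread(p) for p in paths]

# Mertens exposure fusion keeps the well-exposed parts of each frame and blends
# them into a single tone-mapped result; no exposure times are needed.
fusion = cv2.createMergeMertens()
result = fusion.process(images)  # float image, roughly in the 0..1 range

# Scale back to 8 bits for saving and viewing.
cv2.imwrite("fused.jpg", np.clip(result * 255, 0, 255).astype("uint8"))
```

Real camera apps add frame alignment, ghost removal and plenty of tuning on top of this, but the basic principle is the same.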
Pretty much every phone these days has an HDR option, and most of us just choose to leave it on or on Auto. This HDR mode is actually a misnomer for a technique called tone mapping. What it does is create an image that has detail in the brightest as well as the darkest areas of the scene. It can do this by processing a single high-quality image or, more commonly, by capturing multiple images at different exposures and combining them.

In the latter method, the photographer first sets the exposure (essentially the brightness level) of the image low and takes a shot. Then several more shots are taken, gradually increasing the exposure while keeping the camera steady. You now have multiple sets of images: the low-exposure shots have great detail in the brightly lit areas but their darker parts are completely black, while the high-exposure shots have great detail in the darker areas but their bright parts are blown out. You can probably see where this is going. The photographer then puts all these images in an image editor and superimposes them, which creates a final image with detail in both the dark and the bright areas.

Our modern smartphone cameras do all of this for us automatically. They take a bunch of images at different exposures and combine them to create the HDR image. Some may choose to capture a single image and just lift the shadows and bring down the highlights to achieve a similar effect. But none of this is true HDR. You see, even though the image has more detail in the shadows and highlights, that detail has been artificially put there. This is because most of us don't have wide dynamic range monitors or displays, so all the high dynamic range content has to be compressed to fit the limited dynamic range of our screens. And because the image isn't naturally high dynamic range but still has detail in the shadows and highlights, it also ends up looking unnatural and over-processed. With a true high dynamic range display, you would be able to see the details in the highlights and shadows of the aforementioned RAW image easily, but because most of us don't have an HDR display, we have to artificially bring down the highlights and pull up the shadows to match the limited dynamic range of our screens.

You must have seen television manufacturers claiming HDR support on their latest 4K televisions. Even smartphones are now starting to ship with HDR displays. The first one was the ill-fated Samsung Galaxy Note7 last year, but since then we have had the Galaxy S8 and the Galaxy Tab S3, the Xperia XZ Premium and Xperia XZ1, the LG G6 and V30 and, most recently, the iPhone X.

So what is an HDR display? An HDR display has three advantages over a standard dynamic range display (let's just call it SDR, even though that's not an official term): higher brightness, more detail in the brightest and darkest areas of the image, and a wider color gamut.

The first and the second are closely related. The display is capable of showing more detail in the brighter and darker areas of the screen. This is where the often touted but seldom explained "whites are whiter and blacks are blacker" adage comes in. When you look at HDR content (more on this later) on an HDR display and compare it side by side with SDR content on an SDR display, you will notice that the brighter areas of the image are brighter, yet at the same time you can actually see more detail there.
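To put some rough numbers on that, here is a small comparison of contrast expressed in stops. The panel figures are illustrative, typical-looking values, not measurements from the article or any spec sheet:

```python
import math

# Illustrative panel figures (not from any particular spec sheet), in nits.
sdr_peak, sdr_black = 300.0, 0.30      # a decent SDR LCD
hdr_peak, hdr_black = 1000.0, 0.05     # an HDR-capable LCD with local dimming

for name, peak, black in [("SDR", sdr_peak, sdr_black), ("HDR", hdr_peak, hdr_black)]:
    contrast = peak / black
    stops = math.log2(contrast)        # each stop is a doubling of light
    print(f"{name}: {contrast:,.0f}:1 contrast, about {stops:.1f} stops of range")
# SDR: 1,000:1 contrast, about 10.0 stops of range
# HDR: 20,000:1 contrast, about 14.3 stops of range
```

Those extra stops are exactly where the additional highlight and shadow detail lives.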
Take, for example, a shot of someone standing next to a window with bright light streaming in. The side of the face turned towards the window will look overexposed on the SDR display and will just appear white. The same spot on the HDR display will look brighter, but even so you will be able to see the texture and details of the skin rather than a white glowing patch. The same goes for the shadows: in a dark night shot on an SDR display, some areas of the image, such as hair or a dark jacket, will just appear black, but on the HDR display you will be able to make out more detail and see the texture. This is where the high brightness helps. The increased brightness elevates the whole image and lets you see more detail. You may ask why they can't do the same with an SDR display, but on an SDR display increasing the brightness just washes the image out without adding any more detail.

The third aspect is the wider color gamut. Our eyes can see a certain range of colors. Unfortunately, due to various restrictions in transmitting data, whether over television broadcasts or the internet, the images we see on our screens use a significantly smaller subset of colors than what our eyes can see. With a wider color gamut, we are effectively increasing the range of colors the image can contain. It's still not close to the limits of what our eyes can see, but it's still better than an SDR image. What this means is that images appear more lifelike, as you can now see a wider range of colors on your screen. A tomato looks intensely red and vivid in real life but bland on screen, because the display and the format simply don't have a wide enough range of colors to reproduce the object accurately. With HDR, it will look closer to the real-life version, if not quite the same.

To clarify, a wider gamut does not mean more saturated colors. It's not the same as increasing the saturation on your display. Increasing the saturation simply increases the intensity of the colors that are already there; it does not show you new colors. A wider color gamut lets you see more shades of color, which increasing saturation cannot achieve. This is the difference between an oversaturated display and a wide color gamut display.

We now know the display side of the story, but there is the content side as well, which we will discuss below. An HDR display is only an HDR display if it is showing HDR content. Without that, it just goes back to being a really good SDR display. HDR content is currently available in two major formats, HDR-10 and Dolby Vision. While these are referred to as formats, they use existing codecs such as H.264 or HEVC and existing containers such as MP4 or MOV, but carry additional metadata in the file to identify it to HDR-capable systems. An HDR file played back on an SDR display will look flat, with low contrast and muted color, as the system cannot display light and color information outside of its own range.

HDR-10 is an open industry standard created by the Consumer Technology Association, whereas Dolby Vision is a proprietary standard created by Dolby. Think of HDR-10 as USB-C and Dolby Vision as Lightning and you should get the picture. HDR-10 is the most commonly used format because it is free to use and does the job. Everything that claims to support HDR uses HDR-10, while some devices also support Dolby Vision in addition to HDR-10. To be HDR compliant, the content has to be mastered in a certain way.
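Because both formats ride on ordinary codecs and containers and announce themselves through that extra metadata, you can check whether a particular file is flagged as HDR-10 by reading its color tags with ffprobe. This is a minimal sketch, assuming ffprobe is installed; movie.mkv is just a placeholder name, and different encoders may tag things slightly differently:

```python
import json
import subprocess

def probe_hdr(path):
    """Read the color tags of the first video stream with ffprobe."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer,color_primaries,color_space",
         "-of", "json", path],
        capture_output=True, text=True, check=True,
    ).stdout
    stream = json.loads(out)["streams"][0]
    # HDR-10 content is normally tagged with the PQ transfer curve and BT.2020 primaries.
    is_hdr10 = (stream.get("color_transfer") == "smpte2084"
                and stream.get("color_primaries") == "bt2020")
    return stream, is_hdr10

# "movie.mkv" is a placeholder; point this at any local video file.
print(probe_hdr("movie.mkv"))
```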
As for how content gets mastered that way: while existing footage can be tweaked in post-production to become HDR-ready, just like with 3D, the best way to get HDR content is to record it with HDR in mind. Now, there aren't any "HDR cameras" out there as such, but all the high-end video cameras used by professionals, such as RED, ARRI or Blackmagic cameras, capture enough dynamic range and color information by default that the footage can easily be turned into HDR video. While editing, this footage would normally be compressed to fit the narrow dynamic range and color gamut of SDR publishing, but for HDR the colorist can leave a lot more of the dynamic range and color information in the final edit. Depending on which format they choose to go with, it can then be mastered in HDR-10 or Dolby Vision.

The unique thing about Dolby Vision is that Dolby has full control of the entire content pipeline, from working with the content creators on mastering to where and how it is displayed. With HDR-10 you can mess around slightly with your video settings, but with Dolby Vision the settings are locked to what Dolby wants you to see. With Dolby Vision, the visual settings are dynamically altered for every scene using preset metadata for the best image quality. Dolby also has significantly higher hardware requirements, with displays having to meet color and brightness targets beyond those of HDR-10. With such tight control of the proceedings and high minimum requirements, the quality is generally higher with Dolby Vision content, but it also results in less content overall. It's also why we got about eight phones with HDR in the span of a year but only one with Dolby Vision, and why only a handful of expensive, absolute top-of-the-line televisions have Dolby Vision.

The two main ways to acquire HDR content today are Blu-ray discs and streaming services. Blu-ray, specifically 4K Blu-ray, is where you will get the best quality, as the discs carry video and audio at far higher bitrates than streaming and are simply the best way to enjoy your movies or television shows. However, what most people will end up using is streaming services, especially since that is the only option available on mobile. Here, companies like Netflix, Amazon and YouTube rule. Netflix, in particular, has one of the largest libraries of HDR content on the internet and is also the only one to offer both HDR-10 and Dolby Vision content. Amazon would be a close second, along with Hulu. YouTube recently started supporting HDR content, and Google also added some HDR movies to its Google Play Movies service. This week, Apple also threw its hat into the 4K HDR ring with the announcement of the Apple TV 4K, the iPhone X and iTunes 4K content.

However, even with all these services at your disposal, the amount of HDR content is still limited. Not all the content on the aforementioned services is in HDR. Netflix even makes you pay for its four-screen plan to access its 4K HDR library, even if you are the only person using it. Then, depending on your region, much of this already limited library could be further restricted, and some of these services won't offer HDR in your region at all. As such, getting your hands on HDR content right now isn't easy. However, things are slowly improving, and as HDR hardware becomes more accessible, the content situation should improve as well. As an aside, there is also a third way of getting HDR content, and that is in games.
Currently, the PC, PS4 and Xbox One support HDR games that bring all the aforementioned advantages of HDR video. There are, however, no HDR games on mobile yet.

In summary, HDR is all about increasing the quality of your content. Previous advancements in video technology were primarily about increasing the resolution, but with HDR the advancement happens at the pixel level; it's less about more pixels and more about better pixels. The brighter, more dynamic and more vibrant HDR image is far more obvious, even to the novice eye, than a resolution bump, which may or may not be noticeable depending on your visual acuity and distance from the screen.

It must be said that it still depends on two major factors: the quality of the HDR panel and the mastering of the content. Cheaper HDR televisions are nowhere near as good as the most expensive ones, and just like with 3D, some HDR content is clumsily mastered, with absurd colors and contrast to make it pop more. However, because of the requirements of HDR, even a half-decent HDR panel will automatically be better than an SDR panel. As for smartphones, considering only flagship phones currently have HDR displays, you can expect those displays to be good in general. And if you go with Dolby Vision, you can be especially sure of the quality because of all the work Dolby has put in.

It's also worth repeating that HDR displays and the technology behind them have nothing to do with the fake HDR button in your phone's camera app; the latter is largely bogus, while the former is true HDR. I hope this helped clear some of the confusion surrounding HDR. We will be seeing a lot more of this technology in the coming days, but for once it is actually useful and not a gimmick, so I personally am looking forward to it.