Yes, thank you Lenovo Ideapad 3 15IML05.
Yes, one can indeed “upgrade” RAM.
Just like one can use the near-useless SATA port to add a “second” hard drive into the unit.
Well, it’s easy enough to snap in an additional 8GB SODIMM…but I’m not happy about it! The initial whateverGB is soldered on, and while the NVMe M.2 or whatever is easy enough to access and replace, the “second bonus” port is useless.
Yes, I have two of these models, identical, because I can’t figure out how to attach a new power plug onto the MOBO of the last one. Nor can I find a charger that converts AC directly to feed the battery of the old one, nor am I clever enough to make one.
It’s still good. The old one will be my Win11 box. Because that’s all it’s good for!
Or, I’ll just break it by soldering a new DC input hole into the MOBO.
Well, I have a soldering iron, and schematics, plus some breadboards and wire around. I should be able to break something, dammit! Or else make the damned thing go.
/* Pro-tip: ideal holding tubes for SSD drives and loose RAM strips? Empty prescription bottles! */
Breaking is the easy one. Making it work…not so much.
My 5 year old 75in Vizio is having backlight issues, mostly in my bottom left corner.
Started looking for my extended warranty paperwork (which I couldn’t find) and contacted Sam’s Club, where I bought it… They sent me to Allstate, which in turn sent me to SquareTrade, and bottom line, it was a waste of time because I got nowhere.
So now I’m looking at doing my own repair on it sometime in the future, and after watching some repair videos I’m hoping it’s not the actual backlights going bad, because that looks like a real pain in the @$$ to replace… I’m leaning more towards a possible bad LED driver board. Still gonna be a pain, since I will have to remove this giant TV from its wall mount, set it face down on 4 coffee tables, and remove the back cover to get to it.
Just debating on when I’m going to do this
I feel your pain here. I’ve had multiple light strips go bad in TVs, and replaced a few myself, and it is a hellish ordeal. A few years ago I took the plunge and bought an OLED TV, and I’m never going back. The picture is so good it will bring a tear to your eye, but more importantly there are no backlights to go bad.
I don’t understand the tech down to the atomic level, but apparently the gist is a plane coated with the organic material and vacuum sealed between layers of glass and whatnot. I did a ton of research before making the decision. Each light-emitting particle (subpixel/whatever) is rated for something like 100,000 hours, but dims slightly over time, which leads to the “burn in” risk. Nothing ever actually “burns” in, it’s just that if certain pixels are illuminated for long periods, they wear unevenly against the rest, which can create the effect of image retention. I’ve had no problem with this in 3 years so far – 3,500+ power-on hours and it still looks phenomenal.
As long as I treat it gently and don’t break the sealed screen, the logic board and power supply are the only things that can realistically fail, both of which are relatively easy to replace.
OLED has made a lot of progress in the last twenty years in terms of “half-life,” i.e., time to half initial brightness. The early displays (like seven segment numeric displays) barely made it to 1,500 hours. And it took additional years of work to develop a blue emitter. I know this because I spent a number of months working on displays for a new cell phone back in the day, where OLED was one of the contending technologies and I got a crash course in the state-of-the-art at the time.
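To make “half-life” a little more concrete: if emitter brightness decays roughly exponentially (a simplifying assumption on my part, not manufacturer data), the quoted half-life pins down the whole curve. A minimal sketch in Python, loosely treating the 100,000-hour rating mentioned earlier as a half-life purely for illustration:

```python
def luminance_fraction(hours: float, half_life_hours: float) -> float:
    """Fraction of initial brightness left after `hours` of use,
    assuming simple exponential decay (an illustrative model only)."""
    return 0.5 ** (hours / half_life_hours)

# An early emitter with a 1,500-hour half-life is at half brightness
# after 1,500 hours, by definition:
early = luminance_fraction(1500, 1500)      # 0.5

# A modern emitter (treating ~100,000 hours as a half-life) after
# 3,500 power-on hours has barely dimmed:
modern = luminance_fraction(3500, 100_000)  # ~0.976 of initial brightness
```

Uneven wear (the “burn-in” effect described above) is just this curve applied per-pixel: pixels with more on-hours sit slightly lower on it than their neighbors.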
Before much longer — two years at the outside and probably less — you’re going to see “micro LED” displays (not to be confused with “mini LED” backlit LCD displays). These have been in development for as long as or longer than OLED, but they’ve finally just about licked the problems of fabricating near-microscopic, top-firing, non-organic LEDs operating at three different wavelengths, along with the control circuitry, on a single substrate. Like OLED they’ll be small displays at first, e.g., watches, status displays, etc., but they’ll get bigger. Plus they’ll be brighter than OLED displays and without the associated “half-life” and moisture-sensitivity issues.
I’ve been following the micro LED development for awhile as well. It opens up some cool options for scaling up screens – wall-sized TVs and the like – and overcomes the brightness issue with OLED. But I’m skeptical of repairability. With a micro LED panel, whether one tiny LED dies or 10,000 of them die, they’re so small there’s no hope of replacing bad ones – you’d have to replace the whole panel.
The same could arguably be said for OLED, but I think the nature of the organic material makes it less viable for the manufacturer to deliberately use low-quality components they know will fail after X number of hours, which is what they’re doing now with backlit LED strips. Time will tell I guess. I’m typically not a fan of black-box, non-user-serviceable stuff like a screen sandwich made in a vacuum, but in this case I think it’s better than the alternative.
I’ve seen the same evolution, from fluorescent and LED sidelighting to backlighting, then OLED, mini LED and micro LED, and in principle they’re all great, each an improvement over the last, but a manufacturer’s goals and a TV purchaser’s goals are polar opposites. I want a TV that will last and last, but they want me to buy a new one as frequently as possible, and they’ll do everything they possibly can to ensure I must.
I feel your pain, but I’ve also been on the design and manufacturing side. Building something that’s genuinely robust is more expensive, and not just in raw material and production costs. You want something like a near-indestructible smart phone with a user-replaceable battery? It can be designed and built, no question about it. But the per unit cost, even in high volume production, would make a top-of-the-line iPhone look cheap by comparison. The dark side of computer aided design and simulated testing is that it essentially eliminates the need for “conservative” engineering. You know exactly what will work 99.99999% of the time, at which point it becomes cheaper to have the periodic failure and associated warranty costs than pursue another decimal place in reliability.
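The tradeoff described above is easy to put in back-of-envelope terms. A quick sketch with entirely made-up numbers (nothing here comes from a real BOM): once you know the failure rate, you can compare a cheap build plus expected warranty claims against a hardened build that costs more per unit.

```python
def expected_cost(unit_cost: float, failure_rate: float,
                  warranty_cost: float) -> float:
    """Per-unit cost including the expected cost of in-warranty failures.
    Hypothetical model: cost + P(failure) * cost-per-claim."""
    return unit_cost + failure_rate * warranty_cost

# Hypothetical: a baseline build vs. a hardened build that halves the
# failure rate but adds $8 of better parts per unit.
baseline = expected_cost(200.00, 0.02, 150.00)  # 200 + 3.00 = 203.00
hardened = expected_cost(208.00, 0.01, 150.00)  # 208 + 1.50 = 209.50

# The less reliable design wins on total cost -- which is exactly why
# "eating" the occasional failure can beat chasing another decimal place.
```

Under these (invented) numbers, halving the failure rate saves $1.50 per unit in claims but costs $8 in parts, so the spreadsheet picks the flimsier TV every time.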
Now the military? It tends to want that sixth decimal place for various reasons, and at times for things that aren’t easy to design or build in the first place and have to immediately operate in far harsher environments than one’s living room after having been in storage for potentially a few years. Price becomes a consideration secondary to reliability.
I’d like to know what the actual truth is about increased costs for better quality. Maybe you’ve got insight with your background? I’ve read that TVs have low profit margins these days, which manufacturers are offsetting with stuff like selling ads on their screens and selling usage data, but what exactly is this “low” margin? 100 percent? 50 percent?
The cynic in me says selling TVs with crap components is a strategic decision, not a tradeoff to keep costs down or stay competitive, because I’d pay twice the price for a lasting TV built with quality components, but why would they settle for twice the profit when they can squeeze me for 10 times that over a decade replacing TVs that die every 18 months?
It probably is a strategic decision. There are simple things that can be done to improve the reliability of electronics, but they also have significant associated costs:
- Use automotive or industrial grade parts instead of consumer grade — for everything, not just the expensive components. These have passed more rigorous factory testing and will operate longer at higher temperatures, but yields are lower, so they’re proportionately more expensive. Then there are military grade parts, which are even tougher (with even lower yields, hence more expensive still). At the high end are space-rated parts, which are also hardened against radiation and fabricated to minimize outgassing.
- Active, contact cooling (cold plates). Think along the lines of a high-end liquid-cooled gaming PC, but cooling everything that dissipates more than a small amount of heat. This adds not only cost, but size and noise. Plus there can be condensation issues, so the board needs to be given a waterproof coating and/or hermetically sealed in a dry chamber with desiccant that will need periodic replacement as it becomes saturated. You see these things in military electronics.
- Lower density circuit boards with an absolute minimum of vias. Redundant vias where possible. Another size increase, and difficult to achieve with modern small pin pitch ICs.
- Better power supply design that assumes all incoming power is noisy and badly regulated.
- Better hardening against electrostatic discharges. ESD damage is cumulative; a lot of small zaps can kill something just as surely as one big one.
- Extremely conservative schematic design that assumes all the components are going to be crap and the device has to function anyway.
And there are still going to be things that wear out despite not having any moving parts, like flash memory or OLED emitters.
I think my 55" LG OLED screen is 5 years old now, and over the last 2 years the number of dead pixels around the edges of the screen has been increasing. The placement is completely random except that in a few places there are clumps of multiple dead pixels in the same area.
I don’t notice it very often, because most of the time what you’re paying attention to in a scene is in the middle of the screen. About the only time I notice it is in a wide outdoor shot with blue sky and nothing to draw your eye to the center; then the random black dots along the top of the screen end up drawing your eye there.
I avoid watching sports on that TV and don’t use it for gaming, to avoid static logos, so when I put up a solid color on screen I don’t see any trace of burn-in. I have most of the options meant to stop burn-in turned off, since I want the best color and brightness. I thought that after this much time watching widescreen content the top and bottom of the screen would be brighter, from being turned off so much, but that is not an issue yet. I also watch a lot of 4:3 content, so the center of the screen must have thousands more hours on it, but so far no changes in brightness.
Even funnier? I’ve got plenty of those cables to spare. Here kid, take one!
Me, too, and so were some of my relatives’ computers. I bought a USB 3 to SATA/IDE converter ($25) and it “just worked” to connect old drives to a modern Mac and Windows PC. Only one older, chunkier drive wouldn’t spin up, but a higher amperage 12v AC adapter for the converter solved that problem.
I take grief from a certain someone for keeping so many cables and spare parts, but anyone who’s ever been caught empty-handed after purging these items knows the mantra is “never again.” Am I likely to use this PS/2 to 9-pin-serial mouse adapter anytime soon? Nope, but you never know.
I have a couple of miniUSB-B to microUSB-B adapters that I guard zealously. They were impossible to find at the best of times, let alone now.
I rarely throw away old cords and whatnot. We have a big old box of 'em on a shelf in the basement. We refer to it as the technology graveyard.
But it came in handy the other day when the wife’s optical mouse died during work (the laser fell into the mouse somehow!) and she went down to dig out an old spare one until she could go to the office and get a new one.
With retrogaming a thing and older consoles getting harder to find since the second half of the 2010s, aftermarket consoles along with plug & play units have become big. This one - the Atari Gamestation Pro - was reviewed by GenXGrownUp on YouTube.
It does look good with all those Atari 2600, 5200, 7800 and arcade titles plus bonuses.
Still…I own an Atari 7800 (which also plays 2600 games) plus an Evercade. Not sure if I’d get one of these.