this post was submitted on 14 Jul 2025
536 points (96.9% liked)
Technology
you are viewing a single comment's thread
Is it a real thing, or the obligatory overestimated result to get grants because the system is fucked?
I just skimmed the IEEE paper (peer-reviewed, solid journal): the use of 'slash costs' in the title is entirely sensational. The tech gave a SLIGHT increase in efficiency (which is good news - marginal improvements are still very good and can be game-changing if scaled up), but there is no cost/benefit analysis in the paper regarding the additional cost of the lenses, or whether the increased PV efficiency would offset that cost at scale.
Honestly, we don't need the technology to get any better than it is. It's nice, but not necessary. Labor costs of deployment are the biggest limiting factor.
If you get efficiency gains of around 50% (a factor of 1.5, from ~20% efficiency to ~30%) with the same deployment costs, this should nonetheless make it more cost-effective.
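Back-of-the-envelope sketch of that argument, with made-up numbers (the panel area, irradiance, and per-panel deployment cost below are illustrative assumptions, not figures from the paper):

```python
# If deployment cost per panel is fixed, cost per watt scales inversely
# with panel efficiency: 1.5x the efficiency -> cost/W divided by 1.5.
DEPLOYMENT_COST = 1000.0   # assumed labor + hardware cost per panel, $
PANEL_AREA_M2 = 2.0        # assumed panel area, m^2
IRRADIANCE_W_M2 = 1000.0   # standard test-condition irradiance, W/m^2

def cost_per_watt(efficiency: float) -> float:
    """Dollars per watt of rated output for one installed panel."""
    watts = PANEL_AREA_M2 * IRRADIANCE_W_M2 * efficiency
    return DEPLOYMENT_COST / watts

baseline = cost_per_watt(0.20)   # ~20% efficient panel -> 2.50 $/W
improved = cost_per_watt(0.30)   # ~30% efficient panel -> ~1.67 $/W
print(baseline, improved, baseline / improved)  # ratio is exactly 1.5
```

So even if deployment labor dominates total cost, a 1.5x efficiency gain at unchanged install cost cuts the cost per watt by the same factor 1.5 - which is why efficiency improvements still matter.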