The techniques for depositing the thin film absorber layer of a CIGS device are well known, with evaporation and sputtering leading the way and printing pushing hard to catch up.
However, the data from recent production runs indicate that evaporation has a commanding lead in cell efficiency. Companies using evaporation are reporting efficiencies of 14-17% (including NREL's record of over 19%), while companies using sputtering are reporting production efficiencies below 10%.
The reasons for this difference are unclear and have spawned a variety of theories, including:
1. Sputtering and evaporation form thin films with different structures and stress levels. What part does this play in the efficiency gap?
2. Sputtering could be causing substrate damage or thin-film dislocations due to its higher kinetic energy. Is the damage real? How severe is it? Could it be hurting efficiencies?
3. Sputtering does not lend itself to a process that achieves complete and uniform selenization the way evaporation does. It is well known that the complete and uniform incorporation of Se into the thin film matrix is critical to the formation of a good CIGS absorber. Could this be a significant factor in the efficiency differences?
There are several other anecdotal ideas, as yet unstudied and unquantified, that could hold all or part of the answer as well.
I invite any and all theories to be discussed here. The more information that can be disseminated on this topic, the faster we will be able to discover the reason for this discrepancy and perhaps trigger the breakthrough that is so urgently needed.
Therefore, I pose a question to all of you in the CIGS industry: Can sputtering achieve a breakthrough that will propel this technology to the efficiency levels of evaporation or beyond, or are we stuck in the doldrums of a segmented market with sputtering bringing up the rear?