
Neural Networks and Market Intelligence


  Just before the New Year, I had the opportunity to sit down with my old friend and colleague Dr. Conrad Sorenson. Conrad is an expert in the fusion of electronics technology, marketing, and results-oriented business strategy – a discipline he refers to as Technical Scouting – and he specializes in sorting out and quantifying trends in electronic materials.

    [Andy Mackie] The USPTO already has a search engine on its site – why is it sometimes ineffective at helping me find the patents I need?

    [Conrad Sorenson] The USPTO system is a giant Oracle database. We interact with the database by matching keywords against its contents. Unfortunately, because there can be a thousand ways to describe an innovation, using the English language to retrieve them can be quite difficult. Ambiguous results can be improved by cross-filtering with CCL codes and sub-codes (also known as the US Patent Classification System, a rather old-fashioned listing of technologies that can be used to limit results to specific fields of use) or by restricting inquiries to specific fields such as abstracts, inventors, and assignees. Unfortunately, because the database must serve the public, the USPTO limits the length of queries to the patent database, which impedes our ability to extract relevant information.

    [Andy Mackie] How does your patent search technology work?

    [Conrad Sorenson] I have automated both the initial basic search criteria and the process of downloading the results to a local database. Because filtering is done offline, my strategy has evolved to use the USPTO selection criteria broadly, to minimize the amount of pertinent intellectual property overlooked by the initial filtering process. After the pertinent patent data is downloaded locally, my patent scanner consults proprietary databases to automatically segment the results into markets, technologies, and supply chains.
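    [Editor's note: as a rough illustration of the offline pipeline Conrad describes, and not his actual scanner, the sketch below stores broadly filtered results in a local SQLite database and tags each record with a market segment. The run_uspto_query helper and the keyword tables are hypothetical placeholders.]

```python
# Illustrative sketch only: the keyword tables and the run_uspto_query helper
# are hypothetical stand-ins, not Conrad's proprietary scanner or a real USPTO API.
import sqlite3

# Hypothetical lookup table used to segment each patent into a market
# based on words appearing in its abstract.
MARKET_KEYWORDS = {
    "semiconductor": ["wafer", "vacuum chamber", "deposition"],
    "consumer electronics": ["display", "handset"],
}

def segment(abstract):
    """Assign a market label by simple keyword matching (placeholder logic)."""
    text = abstract.lower()
    for market, words in MARKET_KEYWORDS.items():
        if any(word in text for word in words):
            return market
    return "unsegmented"

def store_locally(records, db_path="patents.db"):
    """Persist broadly filtered search results so all fine filtering happens offline."""
    con = sqlite3.connect(db_path)
    con.execute("""CREATE TABLE IF NOT EXISTS patents
                   (number TEXT PRIMARY KEY, title TEXT, abstract TEXT, market TEXT)""")
    for rec in records:
        con.execute("INSERT OR REPLACE INTO patents VALUES (?, ?, ?, ?)",
                    (rec["number"], rec["title"], rec["abstract"], segment(rec["abstract"])))
    con.commit()
    con.close()

# Usage: records would come from an automated, deliberately broad USPTO query, e.g.
# store_locally(run_uspto_query('"chamber" AND "seasoning"'))
```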

    Another important feature of the scanner is that it not only looks into the database of granted patents but also simultaneously explores published documents in the patent application database. Patent applications are fascinating. They can reveal knowledge about internal development investments that could otherwise only be gained under non-disclosure. According to research I performed a few years ago on Atomic Layer Deposition, patents are published within about 9 ± 6 months of the filing date. By analyzing changes in the number of patent filings over time, you can learn whether a technology approach is being emphasized, how it ranks against other technology development efforts, and, by closely examining CCL codes, whether the technology has shifted direction – for example, from ALD processes for depositing high-dielectric-constant materials toward diffusion barrier coatings. The downside to including patent application data is that the results can be obscured – only market share leaders clearly identify themselves as the assignee in patent applications. My scanner sniffs this information out by examining each inventor’s recent assignment information, double-checking that they are in the same general geographic location, and then determining a likely patent assignee.
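    [Editor's note: the assignee-inference heuristic Conrad mentions might look something like the sketch below. The recent_assignments lookup and the crude distance check are assumptions for illustration, not his actual implementation.]

```python
# Hypothetical sketch of the assignee-inference heuristic described above.
from collections import Counter

def infer_assignee(inventors, recent_assignments, max_distance_km=100):
    """
    inventors: list of dicts like {"name": ..., "latlon": (lat, lon)}
    recent_assignments: dict mapping an inventor's name to a list of
        {"assignee": ..., "latlon": (lat, lon)} from their recent granted patents.
    Returns the most plausible assignee for an unassigned application, or None.
    """
    votes = Counter()
    for inventor in inventors:
        for past in recent_assignments.get(inventor["name"], []):
            # Geographic sanity check: the inventor and the past assignee should
            # sit in the same general area before the match is trusted.
            if _roughly_colocated(inventor["latlon"], past["latlon"], max_distance_km):
                votes[past["assignee"]] += 1
    return votes.most_common(1)[0][0] if votes else None

def _roughly_colocated(a, b, max_km):
    # Crude flat-earth distance estimate; adequate for a coarse regional filter.
    dlat_km = (a[0] - b[0]) * 111.0   # ~111 km per degree of latitude
    dlon_km = (a[1] - b[1]) * 85.0    # rough mid-latitude figure
    return (dlat_km ** 2 + dlon_km ** 2) ** 0.5 <= max_km
```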

    As you can tell from the above description, my development efforts continue to center on managing an abundance of data, to winnow it down to the knowledge we seek. When restrictive filters are removed from a patent search, we have to develop new skills for managing the resulting flood of results. Because such techniques can yield thousands of hits, even after market, technology, and supply chain segmentation, I am applying artificial intelligence concepts to the search for knowledge. I am having excellent success applying a neural network to patent search results to distill highly relevant patents from a pool of mixed-relevance and non-relevant IP.

    [Andy Mackie] Can you explain “neural networks” in simple terms?

    [Conrad Sorenson] Neural networks are a method of teaching a computer to “think” much like a human. They emulate the functioning of the brain through weighted linkages (“synapses”) between “nodes” (see figure).
    [Figure: Network Methodology]

    The style of network I am using (a back-propagation neural network) learns a task by comparing its results against a training set, then adjusting the weighted linkages between nodes until error rates drop below a threshold. I “train” the network by providing it with an exemplary set of patents classified by relevance (High, Medium, Low, and Not Relevant). The patent scanner adjusts the link weights until it is able to reproduce my exemplary results, then applies this to unreviewed patents. It automatically sorts through an almost unlimited number of patents (more than one per second) and comes up with results that are very interesting! In this manner, I am able to break through the limitations imposed by keyword searching, by duplicating my judgment of patent relevance and applying it at scale.
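    [Editor's note: a minimal sketch of this train-then-classify loop, built from off-the-shelf scikit-learn pieces rather than Conrad's proprietary scanner, is shown below. The texts, labels, and network size are placeholders; the real training set would be a hand-reviewed collection of patents.]

```python
# Minimal sketch: TF-IDF features feeding a small back-propagation network
# (scikit-learn's MLPClassifier). Placeholder data, not Conrad's scanner.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

# Exemplary patents classified by relevance by a human reviewer.
train_texts = [
    "method of seasoning a plasma processing chamber before wafer deposition",
    "dry rub spice blend for seasoning fried chicken",
]
train_labels = ["High", "Not Relevant"]

# TF-IDF turns each abstract into a weighted word vector; the MLP adjusts its
# link weights by back-propagation until it reproduces the reviewer's judgments
# (training stops once the error improvement falls below the tolerance).
model = make_pipeline(
    TfidfVectorizer(stop_words="english"),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, tol=1e-4),
)
model.fit(train_texts, train_labels)

# The trained model is then applied in bulk to unreviewed patents.
print(model.predict(["chamber seasoning process for semiconductor wafer tools"]))
```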
     

    [Andy Mackie] How do you find the “key phrase” that differentiates the patents I am truly looking for, from the “chaff”?

    [Conrad Sorenson] That’s my secret: I don’t! I unfetter the USPTO keyword search process, and then apply the techniques outlined above to the results. For example, I recently did a search of patents on “semiconductor vacuum chamber seasoning” for a client. It turns out that restricting the USPTO search to semiconductor applications alone excluded some very useful data. I got the best results using the keyword search terms ‘“chamber” AND “seasoning”’. As expected, the results included more references to Kentucky Fried Chicken than to IBM! By segmenting the results into electronics markets, I quickly arrived at results that were useful for my client.
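    [Editor's note: continuing the hypothetical local-database sketch shown earlier, discarding the fried-chicken hits is then a simple offline market filter, for example:]

```python
# Query the local database built earlier and keep only the electronics hits.
import sqlite3

con = sqlite3.connect("patents.db")
rows = con.execute(
    "SELECT number, title FROM patents WHERE market = ?", ("semiconductor",)
).fetchall()
for number, title in rows:
    print(number, title)
con.close()
```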

    [Andy Mackie] As a technology marketing guy, you look at patents in a different way than intellectual property specialists or R&D professionals. Can you give us some insights from your experiences?

    [Conrad Sorenson] I do not look at patents from a traditional standpoint of intellectual property. I consider the USPTO to be a rich source of competitive intelligence. This is analogous to the use of signals intelligence to obtain important information. In World War II, after a change in codes, Allied code breakers would go for long stretches of time without being able to read our adversary’s coded messages. During these dry spells, they could still obtain highly-valuable information by determining where messages were transmitted, when they were sent, and who responded to them. Similarly, what companies are doing – or not doing – and how much they are spending to do so can be determined by information published in the USPTO database.

    Combined with more than a decade of experience looking at online patent information, supply chain segmentation also lets me infer technology maturity. Because a technology progresses in a predictable way, early IP comes from a supply chain that emphasizes universities and national laboratories, transitioning to large industrial laboratories. Materials companies then emerge, followed by OEMs and end users. In a well-developed market such as semiconductor manufacturing, there is roughly a 1:10:100 ratio between materials suppliers, OEMs, and end users. If this ratio is skewed, it indicates a competitive challenge in the outlying tier.
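    [Editor's note: the 1:10:100 rule of thumb can be checked mechanically. The sketch below, with made-up counts and an arbitrary tolerance, flags the tier that deviates most from the nominal pattern.]

```python
def ratio_skew(materials, oems, end_users, tolerance=0.5):
    """Compare observed supply-chain tier counts against the nominal 1:10:100
    pattern and report the tier that deviates most, if any."""
    nominal = {"materials suppliers": 1, "OEMs": 10, "end users": 100}
    observed = {"materials suppliers": materials, "OEMs": oems, "end users": end_users}
    scale = materials  # normalize to the materials-supplier tier
    deviations = {
        tier: abs(observed[tier] / (nominal[tier] * scale) - 1.0)
        for tier in nominal
    }
    outlier, deviation = max(deviations.items(), key=lambda kv: kv[1])
    return outlier if deviation > tolerance else None

# Example with made-up counts: end users are under-represented relative to
# 1:10:100, so that tier is flagged as the likely competitive challenge.
print(ratio_skew(materials=3, oems=28, end_users=120))  # -> "end users"
```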

    [Andy Mackie] I really appreciate your sharing this with me and our blog readers. If you’d like to learn more, Conrad can be reached through his website, by phone at +1 (716) 639-0721, or by email at Conrad.Sorenson@EnablingMaterials.com.


    Cheers! Andy