Welcome back to another week for the geeks.
On the Google Search patent front, it’s been pretty quiet, but I’ve seen a few over the past few weeks that are worth sharing.
So let’s get started.
Latest interesting Google patents
Querying a data graph using natural language queries
- Filed: March 13, 2013
- Granted: October 20, 2020
“The implementations include systems and methods for querying a data graph. One example method includes receiving a machine learning module trained to build a model with multiple weighted features for a query, each feature representing a path in a data graph. The method also includes receiving a search query containing a first search term, associating the search query with the query, and associating the first search term with a first entity in the data graph. The method may also include identifying a second entity in the data graph using the first entity and at least one of the multiple weighted features and providing information regarding the second entity in a response to the search query. Some implementations may also include training the machine learning module, for example by generating positive and negative training examples from a response to a query.”
It is interesting that this was filed back in 2013, because it deals with semantic elements: graphs and entities.
Many SEO pros back then had no idea what that stuff was because it was rarely talked about.
In fact, to this day, many SEO folks don’t really understand how Google handles semantics.
Hell, even recently I still see people talking about archaic approaches like LSI.
Suffice it to say that much of the organic search profession is really lagging behind when it comes to how search actually works these days.
At the heart of this patent is a discussion of how, in the past, a lot of entity relationships and graph data were actually cobbled together manually (can you imagine?), and how Google wanted to automate more of this through machine learning.
This is 2013 too, my friends.
It shouldn’t have still been a topic of conversation over the past few years … but it was.
Let’s look at some points of interest.
“(…) In a data graph, entities such as people, places, things, concepts, etc., can be stored as nodes, and the edges between nodes can show the relationship between the nodes. In such a data graph, the nodes ‘Maryland’ and ‘USA’ can be connected by the edges ‘in country’ and/or ‘has state.’”
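To make the node-and-edge idea concrete, here’s a minimal sketch of how entities and labeled relationships might be stored. This is my own illustration built around the patent’s Maryland/USA example, not Google’s actual implementation or schema:

```python
# A toy data graph: entities are nodes, relationships are labeled edges.
# The entity and edge names below are illustrative placeholders.
data_graph = {
    ("Maryland", "in country"): "USA",
    ("USA", "has state"): "Maryland",
    ("Annapolis", "capital of"): "Maryland",
}

def related_entity(entity, relationship):
    """Follow a labeled edge from one node to the node it points at."""
    return data_graph.get((entity, relationship))

print(related_entity("Maryland", "in country"))  # USA
```

Storing edges as (node, label) pairs keeps the example tiny; a real knowledge graph would index edges in both directions and allow multiple targets per relationship.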
“The knowledge extracted from the text and the data graph is used as input to train a machine learning algorithm that can be used to predict tuples for the data graph. The trained machine learning algorithm can generate multiple weighted features for a given relationship, each feature providing an inference on how two entities might be related.”
“In some implementations, questions can be answered in natural language from the data graph. In such implementations, the machine learning module can be trained to map queries to features, with the features used to provide possible query results. The training may include using positive examples from search records or from query results obtained from a document-based search engine. The trained machine learning module can generate multiple weighted features, each feature representing a possible query response represented by a path in the data graph.”
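In plain terms: each learned feature is a path through the graph with a weight attached, and candidate answers are scored by summing the weights of the paths that reach them. Here’s a rough, hypothetical sketch of that scoring step, with weights and edge labels invented purely for illustration:

```python
# Hypothetical weighted path features for a query like "what state is X in?".
# Each feature is a sequence of edge labels plus a learned weight.
features = {
    ("in state",): 0.9,                    # direct, high-confidence path
    ("located in", "part of state"): 0.4,  # weaker two-hop path
}

# A tiny illustrative graph (placeholder entities and edges).
edges = {
    ("Annapolis", "in state"): "Maryland",
    ("Annapolis", "located in"): "Anne Arundel County",
    ("Anne Arundel County", "part of state"): "Maryland",
}

def follow(entity, path):
    """Walk a sequence of edge labels; return the end entity or None."""
    for label in path:
        entity = edges.get((entity, label))
        if entity is None:
            return None
    return entity

def score_answers(query_entity):
    """Sum feature weights for every entity reachable via a weighted path."""
    scores = {}
    for path, weight in features.items():
        target = follow(query_entity, path)
        if target is not None:
            scores[target] = scores.get(target, 0.0) + weight
    return scores

print(score_answers("Annapolis"))  # Maryland scores 0.9 + 0.4
```

The training the patent describes would be what sets those weights: paths that lead to confirmed answers (positive examples) get reinforced, paths that lead to wrong ones (negative examples) get penalized.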
Search and retrieve structured information cards
- Filed: October 26, 2020
- Granted: November 3, 2020
“Methods, systems, and devices, including computer programs encoded on a computer storage medium, to facilitate the identification of additional trigger terms for a structured information card. In one aspect, the method includes actions to access data associated with a template for presenting structured information, the accessed data referring to (i) a label term and (ii) a value. Other actions may include obtaining a candidate label term, identifying one or more entities associated with the label term, identifying one or more of the entities associated with the candidate label term, and associating with the candidate label term, for each particular one of the one or more entities associated with the candidate label term, (i) a label term that is assigned to the respective entity, and (ii) the value associated with the label term.”
There’s nothing really earth-shaking here, but it does give us an idea of how information cards, entities, knowledge bases, and structured data can work together.
To me, it’s another example of how much SEO has changed over the years, and by far more than many practitioners and publishers in the business seem to notice.
“(…) A card trigger term identification unit is provided which can identify additional trigger terms for a structured information card. With the card trigger term identification unit, the grammar of one or more structured information cards can be adapted over time by evaluating candidate terms for possible inclusion in the grammar of a structured information card.”
“Suppose the grammar for a ‘movie’ structured information card includes the terms ‘movie time,’ ‘movie ticket confirmation,’ and ‘card confirmation number.’ The card trigger term identification unit can parse the terms associated with the grammar of the ‘movie’ structured information card and one or more candidate queries, and identify an additional trigger term for the ‘movie’ structured information card, such as the trigger term ‘movie ticket.’ Accordingly, subsequent queries that include terms such as ‘movie time,’ ‘movie ticket,’ or both, resolve to display the ‘movie’ structured information card in response to such queries.”
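Stripped of the legalese, the idea is roughly this: a candidate term earns its way into a card’s grammar when it shares enough entities with the terms already there. A loose sketch of that evaluation, where all the terms, entity associations, and the threshold are my own placeholders rather than anything from the patent:

```python
# Hypothetical entity associations per term, echoing the 'movie' card example.
term_entities = {
    "movie time": {"movie", "showtime"},
    "movie ticket confirmation": {"movie", "ticket"},
    "movie ticket": {"movie", "ticket"},      # candidate trigger term
    "pizza delivery": {"pizza", "delivery"},  # unrelated candidate
}

# Terms already in the card's grammar.
card_grammar = {"movie time", "movie ticket confirmation"}

def should_add_trigger(candidate, grammar, threshold=2):
    """Accept the candidate if it shares enough entities with the grammar."""
    grammar_entities = set().union(*(term_entities[t] for t in grammar))
    overlap = term_entities[candidate] & grammar_entities
    return len(overlap) >= threshold

print(should_add_trigger("movie ticket", card_grammar))    # True
print(should_add_trigger("pizza delivery", card_grammar))  # False
```

Run over candidate queries continuously, a check like this is how a card’s grammar could “adapt over time,” as the previous excerpt puts it, without anyone hand-curating trigger lists.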
That’s it for this week folks.
As always, never forget the depth of how search engines work and keep pushing the boundaries of your learning and strategies.
Until next week!
Featured image: Created by the author, November 2020
In-post images: USPTO