I made it for myself a few months ago and have been doing small updates since. It lets me combine word lists and run some analysis on the results, the results being unregistered domains. In its current state it provides the GD appraisal and the number of TLDs the name is registered under. I wouldn't release it with those options, as it would strain the GD servers. I'm considering adding the option to provide your own API key for an API that returns the number of TLDs a given name is registered under. More on appraisals below.

Only .com, .net, and .gov are supported, as the free WhoisClient provided by Apache only covers Verisign. Adding more TLDs is possible, but I'm not sure what the impact on speed would be, and I also remember whois for other extensions not being free.

It's useful for gauging demand, at least with certain patterns. For example, late last month I registered InsuranceGecko. I had a list of popular brand animals and had the program prepend "insurance" to them. Out of the 21 combinations, only InsuranceJaguar.com was available; unavailable names included InsuranceSpider and InsuranceZebra.

Here is a thread I created to share some free available domain lists that I found with the tool: https://www.namepros.com/threads/br...com-premiumsend-com-herfurniture-com.1171648/

It doesn't have a UI; you run it from a terminal on your computer, like the Windows command line or PowerShell. I have not tested it on Mac, but it should only require a few edits to make it functional. It is written in Java.

Some functionality:
- It can take a single domain or a file with a list of domains and return the registration date of each name. It can do the opposite as well and return whichever names are available.
- You can combine multiple prefix lists with multiple suffix lists.
- After lists are combined, you can view the frequency of prefixes or suffixes in the available results. The higher the frequency, the less demand there is for that keyword. These results are generally broader because you are combining two or more word lists at once.

Things I'd add to make it more user friendly: it is currently single-threaded and can check the availability of about 10 names/second. I've been thinking about the best way to make it multi-threaded and faster.

I am working on an offline appraisal algorithm that uses data from just under 1 million historical sales. It does not involve machine learning. In its simplest state, it will only appraise two-word domains (or provide a general reflection of the historical sales data, not weighted by year).

I am also working on a way to filter large lists (pending delete, GD closeouts, etc.) by scoring domains based on historical sales data, syllables, length, and more. The core is done; I am now just trying to find better ways to score the names. I do not plan on releasing this, but I might sell the lists. If anyone is interested in that, feel free to ask any questions.

Please reply to the thread or PM me if you have questions, suggestions, or are interested in the data.
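For anyone curious how the availability check mentioned above can work: the tool uses Apache's WhoisClient, but the idea is easy to sketch with nothing but the standard library, since whois is just a plain-text query over TCP port 43. The server name, the "domain " query form, and the "No match for" test are my assumptions about Verisign's behavior, not the tool's actual code.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.Socket;
import java.nio.charset.StandardCharsets;

public class AvailabilityCheck {

    // Verisign's whois server for .com/.net (assumption).
    static final String VERISIGN_WHOIS = "whois.verisign-grs.com";
    static final int WHOIS_PORT = 43;

    // Pure helper: Verisign answers with "No match for \"NAME\""
    // when a name is unregistered.
    static boolean looksAvailable(String whoisResponse) {
        return whoisResponse.contains("No match for");
    }

    // Network call; not exercised in main below.
    static boolean isAvailable(String domain) throws IOException {
        try (Socket sock = new Socket(VERISIGN_WHOIS, WHOIS_PORT)) {
            // "domain NAME" asks for an exact-match record.
            sock.getOutputStream()
                .write(("domain " + domain + "\r\n").getBytes(StandardCharsets.US_ASCII));
            BufferedReader in = new BufferedReader(
                new InputStreamReader(sock.getInputStream(), StandardCharsets.US_ASCII));
            StringBuilder response = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) response.append(line).append('\n');
            return looksAvailable(response.toString());
        }
    }

    public static void main(String[] args) {
        System.out.println(looksAvailable("No match for \"INSURANCEJAGUAR.COM\".")); // true
        System.out.println(looksAvailable("   Domain Name: INSURANCESPIDER.COM"));   // false
    }
}
```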
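The list-combination and frequency features described above boil down to a Cartesian product plus a tally. This is my own sketch of that logic (class and method names are invented, and I've hardcoded .com), not the tool's source:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class Combiner {

    // Join every prefix to every suffix, e.g. "insurance" + "gecko".
    static List<String> combine(List<String> prefixes, List<String> suffixes) {
        List<String> out = new ArrayList<>();
        for (String p : prefixes)
            for (String s : suffixes)
                out.add(p + s + ".com");
        return out;
    }

    // Count how often each prefix appears among the available names.
    // A prefix that keeps showing up unregistered is one with less demand.
    static Map<String, Integer> prefixFrequency(List<String> available, List<String> prefixes) {
        Map<String, Integer> freq = new LinkedHashMap<>();
        for (String p : prefixes) {
            int n = 0;
            for (String d : available)
                if (d.toLowerCase().startsWith(p.toLowerCase())) n++;
            freq.put(p, n);
        }
        return freq;
    }

    public static void main(String[] args) {
        List<String> combos = combine(List.of("insurance"), List.of("gecko", "jaguar", "zebra"));
        System.out.println(combos); // [insurancegecko.com, insurancejaguar.com, insurancezebra.com]
    }
}
```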
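On the multi-threading idea: one straightforward approach is a small fixed thread pool that runs the whois lookups concurrently. This is just a sketch of that approach, not what the tool does; the checker is passed in as a Predicate so the example runs without touching any whois server, and a real version would still need to rate-limit to be polite to the registry.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.function.Predicate;

public class ParallelChecker {

    // Check every domain on the pool and keep the ones reported available,
    // preserving input order.
    static List<String> availableNames(List<String> domains,
                                       Predicate<String> isAvailable,
                                       int threads) throws InterruptedException, ExecutionException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<String>> futures = new ArrayList<>();
            for (String d : domains)
                futures.add(pool.submit(() -> isAvailable.test(d) ? d : null));
            List<String> available = new ArrayList<>();
            for (Future<String> f : futures) {
                String d = f.get();
                if (d != null) available.add(d);
            }
            return available;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        // Stand-in checker: pretend only names ending in "jaguar.com" are free.
        Predicate<String> fake = d -> d.endsWith("jaguar.com");
        System.out.println(availableNames(
            List.of("insurancegecko.com", "insurancejaguar.com"), fake, 4)); // [insurancejaguar.com]
    }
}
```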
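To make the scoring idea concrete: a minimal scorer along those lines could penalize length and syllable count. The weights below and the vowel-run syllable estimate are entirely invented for illustration; the actual scorer, and especially its historical-sales component, is not something I'm showing here.

```java
public class DomainScorer {

    // Rough syllable estimate: count runs of consecutive vowels.
    static int syllables(String word) {
        int count = 0;
        boolean inVowelRun = false;
        for (char c : word.toLowerCase().toCharArray()) {
            boolean vowel = "aeiouy".indexOf(c) >= 0;
            if (vowel && !inVowelRun) count++;
            inVowelRun = vowel;
        }
        return Math.max(count, 1);
    }

    // Higher is better; the weights are arbitrary placeholders.
    static double score(String name) {
        return 100.0 - 3.0 * name.length() - 5.0 * syllables(name);
    }

    public static void main(String[] args) {
        System.out.println(syllables("gecko")); // 2
        System.out.println(score("zebra") > score("insurancezebra")); // true
    }
}
```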