Intel has been developing processors with integrated graphics for a long time, but it was not always so.
Would it make sense to develop a CPU without a GPU again, for those who aren't going to use the integrated graphics at all?
Intel could then add additional cores in the space freed up by the graphics engine. That could increase performance for gamers or developers, couldn't it?
Thank you for your opinions.
I have posted a picture of what I see when I open your link. I see some news...
But the processors I see there now are mainly server or X-series parts.
What about something for home users that isn't as expensive as the X-series?
For example, I use an i7-6700. It has Intel® HD Graphics 530, but I don't use it. Now I'm waiting for the i7-9700, which will probably have Intel® UHD Graphics 630, and I won't use that either.
I have adjusted the filter to show only Intel Core processors:
https://ark.intel.com/Search/FeatureFilter?productType=processors&IntegratedGraphics=false&FilterCurrentProducts=true&FamilyText=Intel%C2%AE%20Core%E2%84%A2%20Processors Intel® Product Specification Advanced Search
The answer to your other question is not so simple. First of all, developing a separate silicon design to do nothing more than remove the graphics engine would simply not be economical. Nor would it necessarily result in cooler processors, faster processors, or processors with more cores. There is no space barrier limiting the core count (i.e., it is not a graphics-engine-vs.-cores decision). There are, however, many (many!) other factors (power consumption, thermal margins, cache architecture, core latency, etc.) that do affect this decision. If you have a desktop system with an add-in graphics card, then the graphics engine in the processor will usually be disabled as a result (this is the default in most board BIOSes), and any effect it might have on processor performance is thus eliminated.
Hope this helps,