• Hjalmar · 6 months ago

It’s very much possible, and in fact it’s such a common problem that it can happen by mistake when the training data set isn’t large enough (see overfitting). A model trained to output just this one image will learn to do so, and over time it should reach essentially 100% accuracy; the model simply learns to ignore whatever arbitrary inputs you give it.
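
A minimal sketch of what that looks like, assuming PyTorch (all names, sizes, and hyperparameters here are illustrative, not from any real setup): a tiny network is trained against one fixed target image while the inputs are re-randomized every step, so the only way to drive the loss down is to become a near-constant function that ignores the input.

```python
# Minimal sketch (PyTorch assumed): overfit a tiny network so it reproduces
# one fixed "image" regardless of what input it receives.
import torch
import torch.nn as nn

torch.manual_seed(0)

target_image = torch.rand(3 * 8 * 8)   # the single (flattened 3x8x8) image to memorize

model = nn.Sequential(
    nn.Linear(16, 64),
    nn.ReLU(),
    nn.Linear(64, 3 * 8 * 8),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for step in range(2000):
    x = torch.randn(32, 16)             # arbitrary inputs, different every step
    loss = loss_fn(model(x), target_image.expand(32, -1))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, the output barely depends on the input at all:
out_a = model(torch.randn(1, 16))
out_b = model(torch.randn(1, 16))
print(loss.item())                              # close to 0
print((out_a - out_b).abs().max().item())       # tiny: the inputs are ignored
```

After enough steps the loss sits near zero and two completely different random inputs produce nearly identical outputs, which is exactly the memorization behaviour described above.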