Chief of police is going to lose their job methinks after using a report from Microsoft 365 Copilot which had errors throughout..

This thread has been deleted. Only users with topic management privileges can see it.
gossithedog@cyberplace.social (original post):

Chief of police is going to lose their job methinks after using a report from Microsoft 365 Copilot which had errors throughout.. then lying about it to press. https://www.bbc.co.uk/news/live/c394zlr8e12t

#3 gossithedog@cyberplace.social (in reply to the original post):

Something which 100% is going to cause bad outcomes in business is you can ask Microsoft 365 Copilot to write a Word document supporting the dumbest idea ever, and it will do it, and won’t label itself Copilot generated.

You can ask it for a 100 page document justifying not having a security team and it’ll do it. Forward that report upwards, save £3m, leave a year later with big bonus. 🫡
#4 niall@mastodon.nz (in reply to #3):

@GossiTheDog brilliant! Now all I need is the job in the first place 🙂
#5 womble@infosec.exchange (in reply to the original post):

@GossiTheDog it's always the coverup that gets you...
#6 itisiboller@infosec.exchange (in reply to #3):

@GossiTheDog Adds scale to management stupidity.
#7 rhube@wandering.shop (in reply to the original post):

@GossiTheDog in a sane world, this would be evidence that, at the very least, police and government departments should be banned from using 'AI'. But that won't happen with the gov all in on this bullshit.
#8 promovicz@chaos.social (in reply to #3):

@GossiTheDog Imagine if M&A folks use the stuff! They already care very little for the morals that they know nothing about…
#9 hypostase@bsd.network (in reply to the original post):

@GossiTheDog
I'm not convinced the evidentiary nature of a Google search should be considered sufficient for police work either.

Not that it surprises me.
#10 bashstkid@mastodon.online (in reply to the original post):

@GossiTheDog Not quite so straightforward - this was one stupid mistake alongside a lot of perfectly valid evidence. However, it will now be used to discredit everything and whitewash the violent and racist Maccabi fans, because that suits this govt.
#11 edbo@mastodon.social (in reply to #3):

@GossiTheDog we use LLMs for our incident reports 🙃
#12 jackeric@beige.party (in reply to the original post):

@GossiTheDog except "using Microsoft Co Pilot" now also means using Outlook to send an email
#13 richlv@mastodon.social (in reply to #10):

@BashStKid @GossiTheDog Also Microslop disguising their slopmachine (Copilot) as everything else could lead to some non-technical people mistaking it for a search.
#14 flying_saucers@mastodon.social (in reply to #10):

@BashStKid @GossiTheDog ah yes

> Assistant Chief Constable Mike O'Hara said: "We got a lot of information and intelligence to suggest that people were going to actively seek out Maccabi Tel Aviv fans and would seek violence towards them."
#15 zl2tod@mastodon.online (in reply to the original post):

@GossiTheDog

If the Chief Constable was misled by his staff he may have been mistaken but not lying.
#16 retech@defcon.social (in reply to #3):

@GossiTheDog "The god you create is the god you deserve."
#17 lionelb@expressional.social (in reply to the original post):

@GossiTheDog

Whoever allowed the use of Copilot by the police should be sacked.
#18 heretochewgum@fosstodon.org (in reply to the original post):

@GossiTheDog

You seriously think:
"Chief of police is going to lose their job"?

Ah bless!
#19 lutherblissett@kolektiva.social (in reply to the original post):

@GossiTheDog WoW! So these fascist thugs got banned with the help of AI? The first time I read AI is useful for something!
#20 rysiek@mstdn.social (in reply to the original post):

@GossiTheDog

Image description: screenshot of a summary. Text:

The chief constable of West Midlands Police admits AI was used in a report that led to Israeli football fans being banned from a match last year

Maccabi Tel Aviv fans were banned from attending a game against Aston Villa, because the Birmingham Safety Advisory Group - which police are part of - deemed the match "high risk" because of previous unrest

1/3
#21 rysiek@mstdn.social (in reply to #20):

@GossiTheDog

Text continued:

The report referenced a match between Maccabi Tel Aviv and West Ham that never happened - pressed by MPs last week, Chief Constable Craig Guildford insisted that the force "do not use AI", instead blaming a Google search

In an earlier appearance at the Home Affairs Committee, he had blamed "social media scraping" for the incorrect information

2/3
#22 rysiek@mstdn.social (in reply to #21):

@GossiTheDog

Text continued:

However, in a letter to the Home Affairs Select Committee released today, he admits the mistake was the "result of a use of Microsoft Co Pilot" - an AI tool

Home Secretary Shabana Mahmood is "considering" the findings of a report into the decision, and will deliver a statement to the Commons later

3/3